Nonstandard Reasoning
Johan van Benthem, University of Amsterdam, Amsterdam, The Netherlands
© 2015 Elsevier Ltd. All rights reserved.
Abstract
In this article, nonstandard reasoning refers to the proliferation of reasoning styles investigated in modern logic beyond its traditional agenda. After a brief statement of standard logical approaches to consequence, we describe motivations for new systems. These include not only inference patterns with special vocabulary from mathematics, philosophy, and linguistics but also new styles of reasoning coming from computer science and artificial intelligence. The resulting landscape is diverse, but we discuss unifying themes such as structural rules, preferences, resources, information, and architecture of logical systems. Many of these reflect recent cognitive trends in modern logic putting ‘social dynamics’ at center stage: reasoning about one’s own information and that of others, information update, acts of communication, processes of inquiry, and games.
The Emergence of Nonstandard Logics
Logic since antiquity has been concerned with ‘the laws of thought,’ supposedly unique and immutable principles of valid reasoning from premises to conclusions. Gradually, over the twentieth century, this monolithic view has changed to a pluralist one where logic tries to chart the different human reasoning styles and their matching logical systems. With the emergence of information, computation, and cognition as central scientific themes in recent decades, new items have kept appearing on this broader agenda. The resulting systems are often called ‘nonstandard logics,’ different from the established regime, but this label is relative. The background to current broader studies of reasoning is still provided by the major approaches to valid inference in standard logic (semantic or proof-theoretic) developed by reflecting on philosophical argument and mathematical proof. Over time, however, these approaches have acquired richer vocabularies and additional notions of inference, whose motivation comes from philosophy, mathematics, linguistics, and many other disciplines. In particular, nonstandard logics often originate in computer science and artificial intelligence when thinking about key patterns in action, problem solving, or planning: arguably human core skills where reasoning may in fact have developed first in our evolutionary history. The resulting landscape of systems is diverse, but there are unifying themes such as structural rules of inference, information structure, preferences, resources, knowledge, or architectures for combining systems. These themes also reflect what might be called more realistic cognitive trends in modern logic: reasoning about one’s own information and that of others, processes for information update and inquiry, communicative acts such as questions and answers, games and rational strategic interaction, and in all these, the use of heterogeneous information from symbolic, graphic, and other sources. In other words, the somewhat negative term ‘nonstandard’ will be understood in this article as referring to natural agenda extensions of logic that go beyond the traditional core concerns.
Standard Logic: Valid Consequence, Proof Calculi, Completeness
The term ‘standard logic’ is mostly used to refer to the achievements of the renaissance of formal logic, starting with
Boole, Frege, and Peirce in the nineteenth century, and culminating in the work of Gödel, Tarski, Turing, and others in the 1930s. The resulting core of the field consists of propositional logic (codifying reasoning about classifying situations using sentence combinations with the key operations ‘not,’ ‘and,’ and ‘or’) and predicate logic of quantifiers (‘for all’ and ‘there exists’) expressing statements about the internal structure of complex domains with objects and predicates. Together, these systems can describe a good deal of mathematical reasoning, and even quite a few core patterns in ordinary reasoning. Not surprisingly then, they have long been the standard core of most textbook presentations of modern logic. The two systems share what seems to be the most widely accepted semantic notion of valid consequence (due to Bolzano and Tarski) requiring ‘transmission of truth’ from premises P to a conclusion C:
Each situation in which all the premises P are true also makes the conclusion C true.
To describe and produce valid inferences, one can sometimes use simple mechanical methods, such as the truth tables of propositional logic. While truth tables are simple, and quite suggestive in assimilating reasoning to computation – a perennial theme in logic – they may promise more than can be delivered for logic in general. More general, and at the same time more relevant to capturing the undeniably stepwise structure of actual reasoning, are proof-theoretic approaches, deriving conclusions C from premises P via a system of rules:
Conclusion C follows from P if it can be derived by legitimate proof steps.
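For propositional logic, the semantic definition above can be checked mechanically, as the remark about truth tables suggests. The following is a minimal illustrative sketch, not taken from this article; the encoding of formulas as nested tuples is an assumption made here only for brevity.

```python
# Illustrative sketch: semantic consequence by brute-force truth tables.
# C follows from premises P iff every valuation making all of P true makes C true.
from itertools import product

def atoms(phi):
    """Collect the propositional atoms occurring in a formula."""
    if isinstance(phi, str):
        return {phi}
    return set().union(*(atoms(part) for part in phi[1:]))

def truth(phi, v):
    """Evaluate a formula under a valuation v (a dict from atoms to booleans)."""
    if isinstance(phi, str):
        return v[phi]
    op = phi[0]
    if op == 'not':
        return not truth(phi[1], v)
    if op == 'and':
        return truth(phi[1], v) and truth(phi[2], v)
    if op == 'or':
        return truth(phi[1], v) or truth(phi[2], v)
    if op == 'if':
        return (not truth(phi[1], v)) or truth(phi[2], v)
    raise ValueError(op)

def follows(premises, conclusion):
    """Bolzano-Tarski consequence: no valuation makes the premises true and the conclusion false."""
    voc = sorted(set().union(*(atoms(f) for f in premises + [conclusion])))
    for values in product([True, False], repeat=len(voc)):
        v = dict(zip(voc, values))
        if all(truth(p, v) for p in premises) and not truth(conclusion, v):
            return False
    return True

# Disjunctive syllogism is valid; affirming the consequent is not.
print(follows([('or', 'A', 'B'), ('not', 'A')], 'B'))   # True
print(follows([('if', 'A', 'B'), 'B'], 'A'))            # False
```

A proof calculus in the sense of the second definition would derive the same valid patterns stepwise by rules, rather than by surveying all valuations.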
Simple steps staying within the reach of human understanding (not too easy, not too hard) are evident not only in mathematical proof but also in common sense reasoning. Proof systems of various sorts have turned out to have an additional benefit: they drive a wide variety of mechanized reasoning systems in artificial intelligence and computational logic. In addition to system building, logic is also about system theory. Perhaps the first significant insight into logical systems
per se concerns the harmony between the preceding two perspectives. The semantic approach to valid consequence based on truth and the syntactic one based on proof coincide for quantificational logic: this is the content of Gödel’s celebrated completeness theorem, which has spawned many similar results for other kinds of reasoning. Another key result is Gödel’s even more famous ‘incompleteness theorem’ showing that formalized mathematical theories going beyond pure logic by adding arithmetical notions are nonaxiomatizable. One of the many consequences of the latter result is Church’s theorem saying that no mechanical algorithm can decide validity in quantificational logic. Both positive and negative results have fueled progress in logic. For instance, one major source of variety in logical theories of inference is the search for decidable systems that escape Church’s result by judiciously restricting the expressive power of the language of quantification.
Extending the Vocabulary: From Mathematics to Natural Language
Logical systems use ‘logical forms’ with fixed operators for key notions (Booleans, quantifiers), plus variable parts for expressions that can be interpreted freely (a typical valid form is ‘from premises A-or-B, and not-A, infer conclusion B’). Propositional and predicate logic have been successful in analyzing mathematical proof, but gradually extensions had to be made to deal with the kind of argument found in philosophy, linguistics, and in the latter part of the twentieth century, computer science and cognitive science. For example, logical semantics of natural language deals with the much richer repertoire that we have for stating and communicating what the world is like according to us. This repertoire includes generalized quantifiers (‘almost all,’ ‘most,’ ‘many,’ and ‘few’) beyond the standard logical notions of ‘all’ and ‘some,’ plus a wide range of so-called modal expressions for our natural daily reasoning involving possibility, time, conditionality, obligation, knowledge, belief, action, etc. – often for both single and multiple agents. The extra vocabulary taken on board here may be considered ‘nonstandard,’ and determining the additional valid inferences supported by it may be subtle – but the canons of defining validity remain the earlier classical ones.
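Generalized quantifiers of the kind just mentioned are standardly modeled as relations between a restrictor set and a scope set. The small sketch below is illustrative only; the sets are invented, and the readings of ‘most’ and ‘few’ are deliberately crude simplifications of what a linguistic analysis would use.

```python
# Illustrative sketch: determiners as relations between a restrictor set A and a scope set B.
def all_(A, B):  return A <= B                   # every A is B
def some(A, B):  return bool(A & B)              # some A is B
def no(A, B):    return not (A & B)              # no A is B
def most(A, B):  return len(A & B) > len(A - B)  # most As are Bs (majority reading)
def few(A, B):   return len(A & B) < len(A - B)  # one crude reading, for illustration only

students = {'ann', 'bob', 'cem', 'dia'}   # made-up restrictor set
passed   = {'ann', 'bob', 'cem', 'eve'}   # made-up scope set
print(all_(students, passed), most(students, passed))   # False True
```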
Alternative Views of Validity
There are also genuine deviations from the classical notion of valid consequence. A recurrent historical example is the view that conclusions only follow validly from consistent premises. Then, nothing follows from a contradiction A-and-not-A, while on the ‘standard’ account, everything does. However, this is just one of many hotspots. Alternative views of reasoning may also arise from studying the common vocabulary in human practice that often goes beyond the usual sentence connectives and quantifiers. Alternatives may also have conceptual motivations in philosophy and science.
Deviations in Natural Language, Cognition, and Philosophy
The idea that ‘natural reasoning’ in our ordinary language differs from standard logical calculi is a recurrent theme in the
twentieth century. One hotspot for variations is the crucial logical notion of implication, represented by several expressions in natural language, with ‘if … then …’ as a ubiquitous example. It seems clear that natural implication does not behave like the truth-functional Boolean conditional – as shown by the ‘paradoxes of material implication.’ This has led to many deviant calculi not only for ‘modal’ or ‘relevant’ implication but also for counterfactual conditionals stating what would have been the case if certain things that are true now had not been the case. These are not just small variations. For instance, counterfactual reasoning keeps much of our thinking about what we do in the actual world ‘in place,’ and it represents a typical human skill. Some of these phenomena are also put into focus in cognitive psychology, where actual human reasoning behavior appears to be much richer than the repertoire that the standard logical canon has catered for. Finally, in addition to these empirical sources of diversity, there is also the influence of philosophical reflection on reasons, evidence, and knowledge or belief based on these. In contemporary epistemology, sophisticated new dynamic accounts of knowledge in terms of tracking the truth over time, or of constructing and maintaining spaces of relevant alternative worlds, suggest interesting new notions of reasoning that lack many of the basic features of classical systems.
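The counterfactual conditionals mentioned above are commonly analyzed, following Stalnaker and Lewis, by looking at the antecedent-worlds most similar to the actual one. The sketch below is a minimal illustration of that idea under a simplifying ‘limit’ assumption; the worlds, propositions, and similarity values are invented toy data, not the article’s own formalism.

```python
# Illustrative sketch: 'if A had been the case, C would have been' is true at w
# iff C holds in all A-worlds most similar to w (lower similarity value = closer).
def counterfactual(w, A, C, worlds, similarity):
    """A and C are sets of worlds; similarity[w][v] is a made-up distance from w."""
    a_worlds = [v for v in worlds if v in A]
    if not a_worlds:
        return True                              # vacuously true if A is impossible
    best = min(similarity[w][v] for v in a_worlds)
    closest = [v for v in a_worlds if similarity[w][v] == best]
    return all(v in C for v in closest)

worlds = ['w0', 'w1', 'w2']
match_struck, match_lit = {'w1', 'w2'}, {'w1'}
similarity = {'w0': {'w0': 0, 'w1': 1, 'w2': 2}}
# At w0 the match was not struck; in the closest world where it was struck, it lit.
print(counterfactual('w0', match_struck, match_lit, worlds, similarity))   # True
```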
Deviations in Scientific Reasoning
Other modulations on valid reasoning arise in the foundations of science. Starting in the heartland of mathematical logic, ‘constructivism’ in mathematics views proofs as constructions of mental entities, whose properties must be established by computational means. Accordingly, intuitionistic logic accepts only those inferences as valid that have a constructive proof. For instance, intuitionistically, one cannot prove an assertion ‘there exists an x such that F(x)’ without producing an explicit example, while classical logic would also accept an argument through the back door, namely, a refutation of the proposition that no object x has the property F. Intuitionistic logic is generalized in a wide family of constructive ‘type theories’ whose validities do not coincide with those of classical logical systems. However, logical proof theory continues to produce new perspectives on valid reasoning. A radical example from the 1980s is linear logic, which reinterprets the propositions involved in inference as information pieces or computational ‘resources’ that can be used only once. Alternative logics have also emerged in physics. Quantum mechanics has inspired quantum logics with failures of Boolean distributivity laws in reasoning about the location or momentum of particles. (Whether this has to be so, or whether a classical redescription is possible after all, is one of the deep issues in the field.) Most of all, reflecting the close analogies between ‘reasoning and reckoning’ (as Hobbes put it in the seventeenth century when these views emerged), computer science has inspired many new systems of reasoning. Examples are dynamic logics describing the effects of computer programs and of actions generally, logic programming (using proof systems as computation procedures), description logics (well-behaved weak systems for databases and knowledge representation), and a host of new systems discovered in artificial intelligence when engaging in the study of ‘commonsense reasoning.’
The result of all these developments for logic today is a wide range of alternatives to classical systems, developed with equal rigor, but different in their valid principles, general philosophy, and mathematical properties.
Varieties of Inference
That ‘natural reasoning’ comes in different styles, with possibly different logical features, was already observed by Bolzano in the early nineteenth century. Scientific reasoning employs stricter standards than common sense (although Bolzano felt that philosophical reasoning was the most demanding skill of all), and common sense as encoded in natural language itself has many varieties that may even be task dependent. The same variety also occurs in artificial intelligence with problem solving or planning (deductive or heuristic), where systems are more ‘optimistic’ than standard logic, representing a typical human ability to jump to conclusions – a rich inferential power kept in check by our talent for self-correction once our beliefs turn out wrong. Indeed, more generally, reasoning cannot be seen in isolation from other cognitive skills. In the following subsections we give a few examples.
Nonmonotonic Logics
Practical reasoning uses defaults going beyond standard logical conclusions. Thus, we use some default rule (‘Dutch trains run on time’) in planning our escape home from a futuristic campus unless counterexamples are explicitly indicated, or, in judging our friends in a cold climate, we take an answer ‘John, Mary’ to the question ‘Who were skating?’ as conveying not just that John and Mary skated, but that no others did, assuming a maximally informative respondent. Crucially, such additional inferences may have to be withdrawn later, because new relevant information comes in: my train may be carrying soccer fans, the person answering my question may turn out to be a government employee. Reasoning in which a conclusion C may follow from premises P, but no longer from the richer set P + Q, is called ‘nonmonotonic’ – while classical consequence is monotonic: once drawn, conclusions persist. A general pattern behind much nonmonotonic reasoning is ‘circumscription,’ which, in its most abstract form, assumes that situations come ordered by some binary relation of preference, and then makes C follow from P if C holds in all most preferred situations where P holds. In the escapist example, the most preferred situations (for the time being) are those where trains run according to schedule, while those containing a disturbing factor are less preferred. This idea comes from artificial intelligence and philosophical logic, and it has a very wide sweep. Preferences may reflect genuine preference, but just as well notions of plausibility, utility, relevance, or simplicity, depending on circumstances. Classical consequence is not in conflict with this view of reasoning. It is still present here as the special case when no models are preferred to others, making all models for the premises most preferred. Still, there is an intuitive difference, which comes to light in a richer picture of what is happening when these different reasoning styles are deployed. Classical reasoning is safe, and thus, it may be associated with
knowledge, and ways of unpacking it. By contrast, nonmonotonic reasoning goes naturally with processes of forming and modifying beliefs, a more volatile attitude than knowledge, but one that is essential to practice. The tandem of inference and belief revision is a crucial practical skill, reflecting two aspects of what it means to ‘learn.’
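The circumscription pattern described above, a conclusion follows if it holds in all most preferred situations satisfying the premises, can be made concrete in a small sketch. The toy situations and the ‘normality’ preference below are invented for illustration; they echo the train example and show a conclusion being withdrawn when the premises grow.

```python
# Illustrative sketch of preferential (circumscriptive) consequence:
# C follows from P iff C holds in every *most preferred* situation satisfying P.
def preferential_consequence(P, C, situations, better):
    """better(s, t) holds when situation s is strictly preferred to situation t."""
    p_sits = [s for s in situations if P(s)]
    most_preferred = [s for s in p_sits if not any(better(t, s) for t in p_sits)]
    return all(C(s) for s in most_preferred)

# Made-up situations: is the train delayed, and are soccer fans on board?
situations = [{'delayed': False, 'fans': False},
              {'delayed': True,  'fans': True}]
normal_first = lambda s, t: (not s['delayed']) and t['delayed']   # prefer undisturbed runs

take_train = lambda s: True                  # premise: I take the train
arrive_on_time = lambda s: not s['delayed']

print(preferential_consequence(take_train, arrive_on_time, situations, normal_first))   # True
# Adding the information that fans are on board defeats the earlier conclusion:
with_fans = lambda s: take_train(s) and s['fans']
print(preferential_consequence(with_fans, arrive_on_time, situations, normal_first))    # False
```

Classical consequence is the special case where no situation is preferred to any other, so that all premise-satisfying situations count.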
Directions and Purposes in Reasoning
Our presentation so far viewed reasoning as a forward-oriented process of accumulating conclusions from given premises. In addition to this forward mode, there are others. This is already clear with standard uses of classical reasoning for refuting, rather than confirming, opinions or even whole theories. Directions multiply when we consider the rich view of reasoning developed by Peirce around 1900, who distinguished deduction, induction, and abduction. Abduction is the common explanatory reasoning that we engage in when seeking plausible premises to support given observations, or in making sense of what we are told. It provides premises for conclusions, rather than the other way around. In contrast with this, induction looks for plausible generalizations beyond the immediate content of the premises, raising again the issue of learning scenarios, now for general facts going beyond specific observations. These different directions and purposes all matter when analyzing activities such as ‘refutation,’ ‘explanation,’ or ‘confirmation’ in the philosophy of science – but it is also easy to find counterparts for all of them in our daily practice. Moreover, like standard logic, these broader views of reasoning have found their way into computer science. Abduction has been studied in connection with logic programming; varieties of belief revision and induction occur in computational learning theories. In recent years, many of these systems have also been related to cognitive science. For instance, cognitive experiments with conditional reasoning point at nonmonotonic phenomena in actual behavior, and it has even been shown that neural networks in the brain can be represented in terms of nonmonotonic logics with inferences repeated over time.
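As an illustration of abduction as the search for plausible premises supporting a given observation, here is a small sketch over Horn-style rules. It is a deliberately simple setting, not the article’s own machinery: candidate hypotheses are drawn from a fixed list of ‘abducibles,’ and an explanation is a smallest hypothesis set from which the observation follows by forward chaining.

```python
# Illustrative sketch: abduction as search for minimal explanations over Horn-style rules.
from itertools import combinations

def forward_closure(facts, rules):
    """rules: list of (body, head) pairs; derive everything derivable from the facts."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for body, head in rules:
            if head not in derived and set(body) <= derived:
                derived.add(head)
                changed = True
    return derived

def abduce(observation, rules, abducibles):
    """Return the smallest hypothesis sets that entail the observation."""
    for size in range(len(abducibles) + 1):
        explanations = [set(h) for h in combinations(abducibles, size)
                        if observation in forward_closure(h, rules)]
        if explanations:
            return explanations
    return []

rules = [(['rain'], 'wet_grass'), (['sprinkler_on'], 'wet_grass')]
print(abduce('wet_grass', rules, ['rain', 'sprinkler_on']))
# [{'rain'}, {'sprinkler_on'}] -- either hypothesis would explain the observation
```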
Reasoning with Uncertainty and Probability
Induction leads naturally to the foundations of uncertainty and probability. Probabilistic reasoning seems a major mode of cognition in its own right. Its standard formalism is definitely not logic, but probability theory. However, starting from Carnap’s ‘inductive logic’ in the 1950s, there is an increasing literature on interfaces between logical and probabilistic reasoning. These include more qualitative styles of probabilistic reasoning in ‘fuzzy logic’ and ‘possibility theory,’ and natural hybrids of standard logical and probabilistic systems. Moreover, statistical long-term bulk behavior is coming to the fore as a topic of study, and here ‘inference statistics’ in terms of percentages of truth and falsity across models, or records of proof attempts, meets logic proper. Finally, qualitative versions of fundamental probabilistic notions such as randomness and independence have been migrating into logic. In particular, new versions of standard predicate logic have attracted attention recently,
allowing for patterns of dependence and independence between variables that go far beyond what has been studied in existing logical systems.
Logical Dynamics: Reasoning as an Activity
Under the influence of ideas from philosophy, computer science, and cognitive science, many logicians have started viewing reasoning as a cognitive process whose dynamics must, and can, be brought out by logical means. This process view of logic is closer to actual argumentation, which is driven by local acts of inferential commitment, while longer term timing and procedural conventions are also crucial to the outcomes.
Dynamic Logics of Information Flow
In this dynamic perspective, reasoning is one of many cognitive tasks that can be studied together. As an early signal of this trend, the Mohist logicians in ancient China already held that inference, observation, and communication are the three great natural sources of information, and that they should work together. This interplay of different information sources can be observed in the empirical sciences, but just as well in daily practice. Modern logical systems merging inference with a broader information dynamics include dynamic epistemic logics that combine acts of inference, observation, and questions, and axiomatize the laws that describe which new information is generated in each step. The relevant information states can be those of individual agents, but given the social interactive character of much reasoning and related informational activities, these logics often work over multiagent models for different agents having different information. A further source of diversity here, beyond the dynamic repertoire of actions and events, is the notion of information at work. Inquiry is tied to the usual semantic notion of information as a range of current options that shrinks when new information comes in. However, other powerful intuitions exist, such as the idea that information flows because of correlations or dependencies between different objects or situations that are brought together in reasoning. There is even a third notion. Valid inferences do not shrink a semantic range of options: they rather elucidate a more fine-grained syntactic notion of information, on whose nature there is no consensus at present. The precise sense of information in logic has itself become a topic of discussion and a source of variety.
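The semantic notion of information as a shrinking range of options can be pictured very simply: an information state is a set of open possibilities, and incoming hard information removes the possibilities it excludes. A minimal sketch with invented toy data, in the spirit of the skating example earlier:

```python
# Illustrative sketch: information states as sets of possibilities, updates as elimination.
def update(state, proposition):
    """Keep only the possibilities compatible with the new information."""
    return {w for w in state if proposition(w)}

def known(state, proposition):
    """A proposition is known iff it holds in every remaining possibility."""
    return all(proposition(w) for w in state)

# Possibilities: who was skating, encoded as frozensets of names (made-up data).
state = {frozenset(), frozenset({'john'}), frozenset({'john', 'mary'})}
john_skated = lambda w: 'john' in w

print(known(state, john_skated))      # False: still an open question
state = update(state, john_skated)    # observe or be told that John skated
print(known(state, john_skated))      # True after the update
```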
Dynamic Semantics of Natural Language
Another take on information dynamics is in terms of the meaning of the very expressions used in reasoning. A modern program is dynamic semantics, a view of the linguistic meaning of a proposition as the update of a hearer’s information state that is produced by it. Thus, propositions become cognitive procedures, and natural language now resembles a programming language modifying information states of agents. Dynamic semantics does not stop at reinterpreting vocabulary. It has also been a source of new notions of inference and consequence. One natural dynamic notion of valid
consequence from premises P to a conclusion C is the following ‘fixed-point’ view:
In final information states reached by processing the successive premises P, updating with the conclusion C will not have any further effect.
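The fixed-point definition just displayed can be made concrete over the same range-of-possibilities picture of information states. The sketch below is illustrative only: it treats premises and conclusion as update functions and tests whether updating with the conclusion is redundant once the premises have been processed.

```python
# Illustrative sketch: 'fixed-point' dynamic consequence over eliminative updates.
def update(state, proposition):
    return {w for w in state if proposition(w)}

def dynamic_consequence(premises, conclusion, initial_state):
    """Process premises in order; the conclusion follows if updating with it changes nothing."""
    state = initial_state
    for p in premises:
        state = update(state, p)
    return update(state, conclusion) == state

# Worlds are (A, B) truth-value pairs; A-or-B followed by not-A dynamically entails B.
worlds = {(a, b) for a in (True, False) for b in (True, False)}
a_or_b = lambda w: w[0] or w[1]
not_a  = lambda w: not w[0]
b      = lambda w: w[1]
print(dynamic_consequence([a_or_b, not_a], b, worlds))   # True
```

Because premises are processed in order, permuting them can in general change the final state, which is one source of the failures of classical structural rules discussed later.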
Dynamic semantics exist for a broad variety of speech acts associated with reasoning and information flow, including questions, answers, suggestions, or commands. Again, multiagent perspectives come to the fore, as speech acts make little sense in solitude.
Belief Revision, Learning, and Social Information Dynamics
A broader perspective on such cognitive actions arises in belief revision theory, which studies updates adding information to the current state, contractions that remove information, and revisions doing both – doing justice to the ‘zigzag character’ of much of our reasoning. We have already noted how acts of revision and self-correction form a natural complement to the somewhat risky nature of nonmonotonic reasoning. Forming hypotheses and learning from errors is also crucial to learning theory, which may be considered an extension of information dynamics to a longer term process horizon. Belief revision and learning may also come in social flavors, since one of the most common causes for belief change is something said by someone else. However, groups of agents also bring their own dynamics, such as the phenomenon of ‘belief merge’ between different groups. Such phenomena have been studied for computational agents, and recently also in social philosophy and social choice theory, since votes are only stages in some larger setting of communication and deliberation, i.e., social reasoning.
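One common way to model such belief change, though by no means the only account, uses a plausibility ordering over possibilities: an agent believes what holds in the most plausible worlds, and incoming information reshapes which worlds remain. The toy example below is a sketch with invented data; it shows a default belief being withdrawn after revision, mirroring the train scenario.

```python
# Illustrative sketch: beliefs from a plausibility order; revision by eliminating worlds.
def believes(plausibility, proposition):
    """plausibility: dict from world to rank (lower = more plausible)."""
    best = min(plausibility.values())
    return all(proposition(w) for w, r in plausibility.items() if r == best)

def revise(plausibility, proposition):
    """Simple revision: keep only worlds compatible with the new information."""
    return {w: r for w, r in plausibility.items() if proposition(w)}

# Worlds are (train_on_time, fans_on_board) pairs; on-time worlds are more plausible by default.
plausibility = {(True, False): 0, (False, True): 1, (False, False): 1}
on_time = lambda w: w[0]
fans    = lambda w: w[1]

print(believes(plausibility, on_time))       # True: the default belief
plausibility = revise(plausibility, fans)    # learn that fans are on board
print(believes(plausibility, on_time))       # False: the belief is withdrawn
```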
Logic and Games
Perhaps the most ‘activist’ perspective on reasoning is one borrowing some major concepts from game theory. Many basic logical notions can be cast in the form of games. Argumentation is a key example, but the same is true for evaluating assertions between parties disagreeing on the truth value of a given statement, and many other logical tasks. By now, ‘logic games’ are a well-known technique, even in textbooks, for teaching logical basics. In particular, logical ‘dialog games’ suggest alternative views of valid reasoning in terms of winning and losing argumentation games, complementing the earlier views in terms of proof and semantics. The dialogical take on valid consequence employs the fundamental game-theoretic notion of a strategy, a response pattern for a player to any sequence of moves by one’s opponents:
In fair rational debate, the proponent of the conclusion C has a winning strategy for upholding C against any play by an opponent granting the premises P.
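The simplest logic games of this kind are evaluation games for propositional formulas: a Verifier claims the formula is true, a Falsifier claims it is false, disjunctions are choices for the Verifier, conjunctions for the Falsifier, and negation switches the two roles. The Verifier has a winning strategy exactly when the formula is true. The sketch below is illustrative; the tuple encoding of formulas is an assumption made here for brevity.

```python
# Illustrative sketch: who has a winning strategy in the evaluation game for a formula?
def verifier_wins(phi, v, verifier=True):
    """True iff the first player has a winning strategy; `verifier` records
    whether that player currently occupies the Verifier role."""
    if isinstance(phi, str):                      # an atomic test ends the game
        return v[phi] if verifier else not v[phi]
    op = phi[0]
    if op == 'not':                               # negation switches the players' roles
        return verifier_wins(phi[1], v, not verifier)
    # the player in the Verifier role chooses at 'or', the Falsifier at 'and'
    branches = [verifier_wins(part, v, verifier) for part in phi[1:]]
    return any(branches) if (op == 'or') == verifier else all(branches)

v = {'A': True, 'B': False}
print(verifier_wins(('or', 'B', ('not', ('and', 'A', 'B'))), v))   # True
print(verifier_wins(('and', 'A', 'B'), v))                         # False
```

The dialogue games for consequence described in the displayed definition are richer, since they concern upholding a conclusion against an opponent who has granted the premises, but the strategic notion of validity is the same in spirit.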
Logic games tie the study of reasoning to that of rational behavior in general. This is one of several places where logic interfaces with game theory, finding shared concerns in the
foundations of social action. The logic and games connection has led to new links of logic not only with philosophy but also with practical fields like argumentation theory.
Systematic Theory of Reasoning
With the rapid growth of nonstandard logics, the unity of classical logic may seem to dissolve. Even so, at a theoretical metalevel, unity may still prevail, be it in new, more general guises – rather than the chanting of particular canonical laws of consequence. The usual mathematical notions and techniques of logic still work in today’s broader realm, in charting possible systems, and exploring general trends. It is too early for a new synthesis of ‘standard’ and ‘nonstandard,’ but some unifying themes are noticeable.
Structural Rules
The above-discussed notion of nonmonotonicity is a hallmark of many nonstandard reasoning systems – although it is a symptom, not a diagnosis of underlying causes. However, there is a natural abstraction level here. In recent decades, it has turned out useful to classify styles of reasoning by their ‘structural rules,’ such as monotonicity and transitivity (‘if P implies Q, and Q implies R, then P implies R’), and others, which move premises around, or duplicate them when needed. The structural rules of classical logic treat premises as an unordered set, whose presentation is irrelevant. By contrast, nonstandard reasoning tends to be more sensitive to the formulation of data, and the corresponding actions, which shows in failures of classical structural rules. Thus, in dynamic semantics, permuting two premises may change the update achieved – reflecting the typical order dependence of natural language. By now complete sets of structural rules have been found for many nonstandard reasoning practices, highlighting what makes them special. This variety has now also reached the philosophy of logic, where ‘logical pluralism’ is the position that this diversity is legitimate.
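In sequent notation, the structural rules at issue can be displayed explicitly. The formulations below are one standard presentation, not quoted from this article, with Γ and Δ standing for finite lists of premises.

```latex
% Standard sequent-style formulations of the classical structural rules (one common notation).
\[
\frac{\Gamma \Rightarrow C}{\Gamma, A \Rightarrow C}\ \text{(monotonicity / weakening)}
\qquad
\frac{\Gamma, A, A \Rightarrow C}{\Gamma, A \Rightarrow C}\ \text{(contraction)}
\]
\[
\frac{\Gamma, A, B, \Delta \Rightarrow C}{\Gamma, B, A, \Delta \Rightarrow C}\ \text{(permutation)}
\qquad
\frac{\Gamma \Rightarrow A \qquad \Delta, A \Rightarrow C}{\Gamma, \Delta \Rightarrow C}\ \text{(cut / transitivity)}
\]
```

Nonmonotonic logics restrict weakening, resource-sensitive logics such as linear logic restrict contraction and weakening, and order-sensitive dynamic systems restrict permutation.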
New Vocabulary
Reasoning and language are intimately connected. When an inferential practice changes, there may be a concomitant change in how we view the language where it all takes place. This language change can happen in two broad varieties. One is extension of classical languages. For instance, current dynamic logics of knowledge update or belief revision tend to add new logical operators of ‘learning’ connected to information changes, and then study their laws as an extension of existing logical systems. However, a more radical line is also noticeable in the current literature. A nonstandard notion of reasoning may lead to reinterpretation of the existing traditional logical operations. Thus, in logic games, a disjunction is a ‘choice’ of a move by the defending player, while a negation signals a ‘role switch’ between two players. Nonstandard reasoning may also involve new operations, outside the classical core. In particular, with games, there is not one conjunction, as in Boolean logic, but three: choosing which game to play at the start, playing one game after another, or playing two games concurrently. All
these support logical laws for ‘conjunction’ that are related, but also part company in crucial respects. Similar proliferation phenomena can be observed with the earlier mentioned resource-oriented proof-theoretic logics, from intuitionistic to linear systems. Thus modern calculi of reasoning can be much richer in vocabulary than classical logical systems: broader reasoning means richer language. Again, theory is lagging behind agenda expansion here. The two strategies, language extension and language reinterpretation, seem to be connected in various ways. However, so far, we have only bits and pieces of translation results, and no general picture.
Mechanisms and Structures
More important than individual logical laws, deviant or not, are general mechanisms behind nonstandard reasoning systems. Striking unifying notions mentioned before included ordering relevant situations by various kinds of preference relations, fine-grained computational views of handling inferential resources, and natural repertoires of dynamic informational actions and processes. These notions provide a much richer picture of actual reasoning, and ‘parameters’ that can be set for different purposes.
Other important general trends go beyond the single-sentence habitat of standard logic. One of these is the temporal longer term. In addition to the usual local semantic ‘meaning’ of single inferences, acts of observation, or questions, there is the art of ‘making sense’ of these events. This art requires the setting of some longer term informational process, say a conversation, a process of inquiry, or a learning scenario. This was clear, for instance, with the game-theoretic view of validity that was discussed earlier: a strategy is a recipe for long-term behavior, guaranteeing certain outcomes in the long run. These temporal strategic aspects of reasoning are slowly coming to the fore in the modern literature, in the form of new research lines in between logic, process theories in computer science, and game theory.
Another large-scale aspect of reasoning that is ignored in a simple-minded premise-conclusion format is architecture of representation and processing. How are the data that reasoning has to work with usefully structured? How do different styles of reasoning cooperate to achieve useful effects? How do symbolic and nonsymbolic data (say visual, or just observing movement) cooperate in reasoning? One counterpart to the temporal long term here is spatial bulk. Reasoning takes place on the basis of experience, i.e., a vast reservoir of earlier reasoning tasks stored in some way in our memory (perhaps compressed into some sort of retrievable probabilistic version). How can we describe this, and what might be emergent statistical phenomena here going beyond our traditional idea of a logical system? Again, the earlier-mentioned interface between logic and probability becomes relevant here.
Finally, a pervasive theme in a broader understanding of reasoning is its social nature. Much reasoning is a many-mind activity involving agents that communicate, cogitate, and act – and the repercussions of this view of ‘intelligent interaction’ are slowly unfolding in current research on multiagent logics, computation, and games.
A Triangle of Disciplines
One way of looking at the nonstandard reasoning themes presented here is as a shift in character of logical theory, from purely normative to more descriptive. Now this may not be quite the right perspective: nonmonotonic logics can well be normative, and so can dynamic logics of observation, telling us what we should get out of what we see. However, what is true is that a natural triangle of disciplines is involved in most of the themes in this article. Some tend more toward theory (logic, philosophy, and mathematics), others are more driven by empirical data (linguistics or cognitive science), and interestingly, there is also a crucial third category of disciplines, such as computer science, that are not easily classified as either normative or descriptive. Drawing sharp distinctions between the three kinds of discipline seems increasingly irrelevant in the study of reasoning.
Coda. Despite all this legitimate interdisciplinary variety at the level of cognitive activities, there may still be a privileged role for standard logic at another level. Significantly, in the metalevel system building about all these activities, logicians tend to follow classical methods from the field, since these have set standards of rigor and perspicuity that no one wants to give up. In this sense, there is much more continuity in the field than might appear from this article. The standard modus operandi is there to serve all.
Conclusion
Modern logic arose out of reflection on proof in mathematics and argumentation in philosophy, and created the standard systems of the twenty-first century. Modern studies of reasoning, while retaining the methods of this phase, deal with a much
broader spectrum of laws, structures, and processes than the repertoire of a standard logic textbook. Moreover, in doing so, they interface in new ways with disciplines such as philosophy, linguistics, computer science, cognitive science, game theory, argumentation theory, legal studies, and others. In this light, ‘nonstandard reasoning’ signals nothing marginal or subversive, except for insisting on this broader horizon.
See also: Axiomatic Theories; Deductive Reasoning Systems; Game Theory; Knowledge Representation; Logics for Knowledge Representation; Mathematical Models in Philosophy of Science; Scientific Discovery, Computational Models of.
Further Reading
Abramsky, S., Gabbay, D., Maibaum, T. (Eds.), 1992–2000. Handbook of Logic in Theoretical Computer Science. Oxford University Press, Oxford, UK.
Adriaans, P., van Benthem, J. (Eds.), 2008. Handbook of the Philosophy of Information. Elsevier, Amsterdam, The Netherlands.
Barwise, J. (Ed.), 1977. Handbook of Mathematical Logic. North-Holland, Amsterdam.
Gabbay, D., Guenthner, F. (Eds.), 1983–1988. Handbook of Philosophical Logic. Reidel, Dordrecht, The Netherlands; Revised and expanded, 2001. Kluwer, Dordrecht, The Netherlands.
Gabbay, D., Hogger, C., Robinson, J. (Eds.), 1993–1998. Handbook of Logic in Artificial Intelligence and Logic Programming. Oxford University Press, Oxford, UK.
Gabbay, D., Woods, J. (Eds.), 2002. Handbook of the Logic of Argument and Inference: The Turn towards the Practical. Elsevier, Amsterdam, The Netherlands.
Van Benthem, J., ter Meulen, A. (Eds.). Handbook of Logic and Language. Elsevier, Amsterdam, The Netherlands; Revised and expanded edition, 2010.
Van Benthem, J., Gupta, A. (Eds.), 2011. Logic and Philosophy Today. College Publications, London.