
- CONTENTS
- PREFACE
- LIST OF FIGURES
- INTRODUCTION
- 1.1 WHAT IS TIME?
- 1.2 SIMULATION
- 1.3 TESTING
- 1.4 VERIFICATION
- 1.6 USEFUL RESOURCES
- 2.1 SYMBOLIC LOGIC
- 2.1.1 Propositional Logic
- 2.1.2 Predicate Logic
- 2.2 AUTOMATA AND LANGUAGES
- 2.2.1 Languages and Their Representations
- 2.2.2 Finite Automata
- 2.3 HISTORICAL PERSPECTIVE AND RELATED WORK
- 2.4 SUMMARY
- EXERCISES
- 3.1 DETERMINING COMPUTATION TIME
- 3.2 UNIPROCESSOR SCHEDULING
- 3.2.1 Scheduling Preemptable and Independent Tasks
- 3.2.2 Scheduling Nonpreemptable Tasks
- 3.2.3 Nonpreemptable Tasks with Precedence Constraints
- 3.2.5 Periodic Tasks with Critical Sections: Kernelized Monitor Model
- 3.3 MULTIPROCESSOR SCHEDULING
- 3.3.1 Schedule Representations
- 3.3.3 Scheduling Periodic Tasks
- 3.4 AVAILABLE SCHEDULING TOOLS
- 3.4.2 PerfoRMAx
- 3.4.3 TimeWiz
- 3.6 HISTORICAL PERSPECTIVE AND RELATED WORK
- 3.7 SUMMARY
- EXERCISES
- 4.1 SYSTEM SPECIFICATION
- 4.2.1 Analysis Complexity
- 4.3 EXTENSIONS TO CTL
- 4.4 APPLICATIONS
- 4.4.1 Analysis Example
- 4.5 COMPLETE CTL MODEL CHECKER IN C
- 4.6 SYMBOLIC MODEL CHECKING
- 4.6.1 Binary Decision Diagrams
- 4.6.2 Symbolic Model Checker
- 4.7.1 Minimum and Maximum Delays
- 4.7.2 Minimum and Maximum Number of Condition Occurrences
- 4.8 AVAILABLE TOOLS
- 4.9 HISTORICAL PERSPECTIVE AND RELATED WORK
- 4.10 SUMMARY
- EXERCISES
- VISUAL FORMALISM, STATECHARTS, AND STATEMATE
- 5.1 STATECHARTS
- 5.1.1 Basic Statecharts Features
- 5.1.2 Semantics
- 5.4 STATEMATE
- 5.4.1 Forms Language
- 5.4.2 Information Retrieval and Documentation
- 5.4.3 Code Executions and Analysis
- 5.5 AVAILABLE TOOLS
- 5.6 HISTORICAL PERSPECTIVE AND RELATED WORK
- 5.7 SUMMARY
- EXERCISES
- 6.1 SPECIFICATION AND SAFETY ASSERTIONS
- 6.4 RESTRICTED RTL FORMULAS
- 6.4.1 Graph Construction
- 6.5 CHECKING FOR UNSATISFIABILITY
- 6.6 EFFICIENT UNSATISFIABILITY CHECK
- 6.6.1 Analysis Complexity and Optimization
- 6.7.2 Timing Properties
- 6.7.3 Timing and Safety Analysis Using RTL
- 6.7.5 RTL Representation Converted to Presburger Arithmetic
- 6.7.6 Constraint Graph Analysis
- 6.8 MODECHART SPECIFICATION LANGUAGE
- 6.8.1 Modes
- 6.8.2 Transitions
- 6.9.1 System Computations
- 6.9.2 Computation Graph
- 6.9.3 Timing Properties
- 6.9.4 Minimum and Maximum Distance Between Endpoints
- 6.9.5 Exclusion and Inclusion of Endpoint and Interval
- 6.10 AVAILABLE TOOLS
- 6.11 HISTORICAL PERSPECTIVE AND RELATED WORK
- 6.12 SUMMARY
- EXERCISES
- 7.1.1 Timed Executions
- 7.1.2 Timed Traces
- 7.1.3 Composition of Timed Automata
- 7.1.4 MMT Automata
- 7.1.6 Proving Time Bounds with Simulations
- 7.2.1 Untimed Traces
- 7.2.2 Timed Traces
- 7.3.1 Clock Regions
- 7.3.2 Region Automaton
- 7.4 AVAILABLE TOOLS
- 7.5 HISTORICAL PERSPECTIVE AND RELATED WORK
- 7.6 SUMMARY
- EXERCISES
- TIMED PETRI NETS
- 8.1 UNTIMED PETRI NETS
- 8.2 PETRI NETS WITH TIME EXTENSIONS
- 8.2.1 Timed Petri Nets
- 8.2.2 Time Petri Nets
- 8.3 TIME ER NETS
- 8.3.1 Strong and Weak Time Models
- 8.5.1 Determining Fireability of Transitions from Classes
- 8.5.2 Deriving Reachable Classes
- 8.6 MILANO GROUP’S APPROACH TO HLTPN ANALYSIS
- 8.6.1 Facilitating Analysis with TRIO
- 8.7 PRACTICALITY: AVAILABLE TOOLS
- 8.8 HISTORICAL PERSPECTIVE AND RELATED WORK
- 8.9 SUMMARY
- EXERCISES
- PROCESS ALGEBRA
- 9.1 UNTIMED PROCESS ALGEBRAS
- 9.2 MILNER’S CALCULUS OF COMMUNICATING SYSTEMS
- 9.2.1 Direct Equivalence of Behavior Programs
- 9.2.2 Congruence of Behavior Programs
- 9.2.3 Equivalence Relations: Bisimulation
- 9.3 TIMED PROCESS ALGEBRAS
- 9.4 ALGEBRA OF COMMUNICATING SHARED RESOURCES
- 9.4.1 Syntax of ACSR
- 9.4.2 Semantics of ACSR: Operational Rules
- 9.4.3 Example Airport Radar System
- 9.5 ANALYSIS AND VERIFICATION
- 9.5.1 Analysis Example
- 9.5.2 Using VERSA
- 9.5.3 Practicality
- 9.6 RELATIONSHIPS TO OTHER APPROACHES
- 9.7 AVAILABLE TOOLS
- 9.8 HISTORICAL PERSPECTIVE AND RELATED WORK
- 9.9 SUMMARY
- EXERCISES
- 10.3.1 The Declaration Section
- 10.3.2 The CONST Declaration
- 10.3.3 The VAR Declaration
- 10.3.4 The INPUTVAR Declaration
- 10.3.5 The Initialization Section INIT and INPUT
- 10.3.6 The RULES Section
- 10.3.7 The Output Section
- 10.5.1 Analysis Example
- 10.6 THE ANALYSIS PROBLEM
- 10.6.1 Finite Domains
- 10.6.2 Special Form: Compatible Assignment to Constants,
- 10.6.3 The General Analysis Strategy
- 10.8 THE SYNTHESIS PROBLEM
- 10.8.1 Time Complexity of Scheduling Equational
- 10.8.2 The Method of Lagrange Multipliers for Solving the
- 10.9 SPECIFYING TERMINATION CONDITIONS IN ESTELLA
- 10.9.1 Overview of the Analysis Methodology
- 10.9.2 Facility for Specifying Behavioral Constraint Assertions
- 10.10 TWO INDUSTRIAL EXAMPLES
- 10.10.2 Specifying Assertions for Analyzing the FCE Expert System
- Meta Rules of the Fuel Cell Expert System
- 10.11.1 General Analysis Algorithm
- 10.11.2 Selecting Independent Rule Sets
- 10.11.3 Checking Compatibility Conditions
- 10.12 QUANTITATIVE TIMING ANALYSIS ALGORITHMS
- 10.12.1 Overview
- 10.12.2 The Equational Logic Language
- 10.12.3 Mutual Exclusiveness and Compatibility
- 10.12.5 Program Execution and Response Time
- 10.12.8 Special Form A and Algorithm A
- 10.12.9 Special Form A
- 10.12.10 Special Form D and Algorithm D
- 10.12.11 The General Analysis Algorithm
- 10.12.12 Proofs
- 10.13 HISTORICAL PERSPECTIVE AND RELATED WORK
- 10.14 SUMMARY
- EXERCISES
- 11.1 THE OPS5 LANGUAGE
- 11.1.1 Overview
- 11.1.2 The Rete Network
- 11.2.1 Static Analysis of Control Paths in OPS5
- 11.2.2 Termination Analysis
- 11.2.3 Timing Analysis
- 11.2.4 Static Analysis
- 11.2.5 WM Generation
- 11.2.6 Implementation and Experiment
- 11.3.1 Introduction
- 11.3.3 Response Time of OPS5 Systems
- 11.3.4 List of Symbols
- 11.3.5 Experimental Results
- 11.3.6 Removing Cycles with the Help of the Programmer
- 11.4 HISTORICAL PERSPECTIVE AND RELATED WORK
- 11.5 SUMMARY
- EXERCISES
- 12.1 INTRODUCTION
- 12.2 BACKGROUND
- 12.3 BASIC DEFINITIONS
- 12.3.1 EQL Program
- 12.3.4 Derivation of Fixed Points
- 12.4 OPTIMIZATION ALGORITHM
- 12.5 EXPERIMENTAL EVALUATION
- 12.6 COMMENTS ON OPTIMIZATION METHODS
- 12.6.1 Qualitative Comparison of Optimization Methods
- 12.7 HISTORICAL PERSPECTIVE AND RELATED WORK
- 12.8 SUMMARY
- EXERCISES
- BIBLIOGRAPHY
- INDEX

CHAPTER 2
ANALYSIS AND VERIFICATION OF NON-REAL-TIME SYSTEMS
A great collection of techniques and tools is available for reasoning about, analyzing, and verifying non-real-time systems. This chapter explores the foundations of these techniques, which include symbolic logic, automata, formal languages, and state transition systems. Many analysis and verification techniques for real-time systems are based on these untimed approaches, as we will see in later chapters. Here, we give a condensed introduction to some of these untimed approaches without providing mathematically involved proofs, and describe their applications to untimed versions of several simple real-time systems.
2.1 SYMBOLIC LOGIC
Symbolic logic is a collection of languages that use symbols to represent facts, events, and actions, and that provide rules for symbolizing reasoning. Given the specification of a system and a collection of desirable properties, both written as logic formulas, we can attempt to prove that these desirable properties are logical consequences of the specification. In this section, we introduce propositional logic (also called propositional calculus, zero-order logic, digital logic, or Boolean logic; the simplest symbolic logic), predicate logic (also called predicate calculus or first-order logic), and several proof techniques.
2.1.1 Propositional Logic
Using propositional logic, we can write declarative sentences called propositions that can be either true (denoted by T) or false (denoted by F) but not both. We use an uppercase letter or a string of uppercase letters to denote a proposition.
Example
P denotes “car brake pedal is pressed”
Q denotes “car stops within five seconds”
R denotes “car avoids a collision”
These symbols P, Q, and R, used to represent propositions, are called atomic formulas, or simply atoms. To express more complex propositions such as the following compound proposition, we use logical connectives such as → (if-then or implies):
“if car brake pedal is pressed, then car stops within five seconds.”
This compound proposition is expressed in propositional logic as:
P → Q
Similarly, the following statement
“if car stops within five seconds, then car avoids a collision”
is expressed as:
Q → R
Given these two propositions, we can easily show that P → R, that is,
“if car brake pedal is pressed, then car avoids a collision.”
We can combine propositions and logical connectives to form complicated formulas. A well-formed formula is either a proposition or a compound proposition formed according to the following rules.
Well-Formed Formulas: Well-formed formulas in propositional logic are defined recursively as follows:
1. An atom is a formula.
2. If F is a formula, then (¬F) is a formula, where ¬ is the not operator.
3. If F and G are formulas, then (F ∧ G), (F ∨ G), (F → G), and (F ↔ G) are formulas. (∧ is the and operator, ∨ is the or operator, ↔ stands for if and only if or iff.)
4. All formulas are generated using the above rules.
Some parentheses in a formula can be omitted for conciseness if there is no ambiguity.
P   Q   P → Q
F   F   T
F   T   T
T   F   F
T   T   T

Figure 2.1 Truth table of P → Q.
P   Q   ¬P   P ∧ Q   P ∨ Q   P → Q   P ↔ Q
F   F   T    F       F       T       T
F   T   T    F       T       T       F
T   F   F    F       T       F       F
T   T   F    T       T       T       T

Figure 2.2 Truth table for simple formulas.
Interpretation: An interpretation of a propositional formula G is an assignment of truth values to the atoms A1, . . . , An in G in which every Ai is assigned either T or F, but not both.
A formula G is said to be true under an interpretation iff G evaluates to true in that interpretation; otherwise, G is said to be false in the interpretation. A truth table displays the truth values of a formula G for all possible interpretations of G. For a formula G with n distinct atoms, there are $2^n$ distinct interpretations of G. Figure 2.1 shows the truth table for P → Q. Figure 2.2 shows the truth table for several simple formulas.
A formula is valid iff it is true under all its interpretations. A formula is invalid iff it is not valid. A formula is unsatisfiable (inconsistent) iff it is false under all its interpretations. A formula is satisfiable (consistent) iff it is not unsatisfiable.
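The exhaustive truth-table check behind these definitions is easy to mechanize. The following Python sketch (an illustration of the technique, not code from the text; the tuple encoding of formulas is a hypothetical choice) enumerates all $2^n$ interpretations and classifies a formula as valid or satisfiable:

```python
from itertools import product

def atoms(f):
    """Collect the atom names appearing in a formula."""
    if isinstance(f, str):
        return {f}
    op, *args = f
    return set().union(*(atoms(a) for a in args))

def ev(f, i):
    """Evaluate formula f under interpretation i (a dict atom -> bool)."""
    if isinstance(f, str):
        return i[f]
    op, *args = f
    vals = [ev(a, i) for a in args]
    if op == 'not': return not vals[0]
    if op == 'and': return vals[0] and vals[1]
    if op == 'or':  return vals[0] or vals[1]
    if op == 'imp': return (not vals[0]) or vals[1]
    if op == 'iff': return vals[0] == vals[1]
    raise ValueError(op)

def interpretations(f):
    """Yield all 2^n truth assignments to the atoms of f."""
    names = sorted(atoms(f))
    for bits in product([False, True], repeat=len(names)):
        yield dict(zip(names, bits))

def valid(f):        # true under every interpretation
    return all(ev(f, i) for i in interpretations(f))

def satisfiable(f):  # true under at least one interpretation
    return any(ev(f, i) for i in interpretations(f))
```

For instance, `valid(('imp', 'P', 'P'))` holds, while `satisfiable(('and', 'P', ('not', 'P')))` does not, matching the definitions of validity and unsatisfiability above.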
A literal is an atomic formula or the negation of an atomic formula. A formula is in conjunctive normal form (CNF) if it is a conjunction of disjunctions of literals and can be written as

$\bigwedge_{i=1}^{n} \Big( \bigvee_{j=1}^{m_i} L_{i,j} \Big)$

where $n \ge 1$; $m_1, \ldots, m_n \ge 1$; and each $L_{i,j}$ is a literal.
A formula is in disjunctive normal form (DNF) if it is a disjunction of conjunctions of literals and can be written as

$\bigvee_{i=1}^{n} \Big( \bigwedge_{j=1}^{m_i} L_{i,j} \Big)$

where $n \ge 1$; $m_1, \ldots, m_n \ge 1$; and each $L_{i,j}$ is a literal. These two normal forms make it easier for proof procedures to manipulate and analyze logic formulas. Figure 2.3 lists the laws stating which formulas are equivalent. These laws are useful for transforming and manipulating formulas.

Idempotency:      (P ∨ P) = P
                  (P ∧ P) = P
Implication:      P → Q = ¬P ∨ Q
Commutativity:    (P ∧ Q) = (Q ∧ P)
                  (P ∨ Q) = (Q ∨ P)
                  (P ↔ Q) = (Q ↔ P)
Associativity:    ((P ∧ Q) ∧ R) = (P ∧ (Q ∧ R))
                  ((P ∨ Q) ∨ R) = (P ∨ (Q ∨ R))
Absorption:       (P ∧ (P ∨ Q)) = P
                  (P ∨ (P ∧ Q)) = P
Distributivity:   (P ∧ (Q ∨ R)) = ((P ∧ Q) ∨ (P ∧ R))
                  (P ∨ (Q ∧ R)) = ((P ∨ Q) ∧ (P ∨ R))
Double Negation:  ¬¬P = P
DeMorgan:         ¬(P ∧ Q) = (¬P ∨ ¬Q)
                  ¬(P ∨ Q) = (¬P ∧ ¬Q)
Tautology:        (P ∨ Q) = P if P is a tautology (true)
                  (P ∧ Q) = Q if P is a tautology (true)
Unsatisfiability: (P ∨ Q) = Q if P is unsatisfiable (false)
                  (P ∧ Q) = P if P is unsatisfiable (false)

Figure 2.3 Equivalent formulas.
To show that a statement logically follows from another statement, we first define the meaning of logical consequence. A formula G is a logical consequence of formulas F1, . . . , Fn iff for every interpretation in which F1 ∧ · · · ∧ Fn is true, G is also true. Then (F1 ∧ · · · ∧ Fn) → G is a valid formula.
We can use the resolution principle to establish logical consequences, and this principle can be stated as follows. First, we define a clause as a finite set, possibly empty, of literals. A clause can also be defined as a finite disjunction of zero or more literals. The empty clause is denoted by □. A clause set is a set of clauses. A unit clause contains one literal.
Resolution Principle: For any two clauses C1 and C2, if there is a literal L1 in C1 and a literal L2 in C2 such that L1 ∧ L2 is false, then the resolvent of C1 and C2 is the clause consisting of the disjunction of the remaining literals in C1 and C2 after removing L1 and L2 from C1 and C2, respectively.
It can be easily proved that a resolvent of two clauses is a logical consequence of these two clauses.
14 ANALYSIS AND VERIFICATION OF NON-REAL-TIME SYSTEMS
Example. Suppose we have two clauses C1 and C2:

C1 : P ∨ Q
C2 : ¬Q ∨ R ∨ ¬S

Because literal Q in C1 and ¬Q in C2 are complementary (their conjunction is false), we remove these two literals from their respective clauses and construct the resolvent by forming the disjunction of the remaining literals: P ∨ R ∨ ¬S.
The resolvent, if it exists, of two unit clauses is the empty clause □. If a set S of clauses is unsatisfiable, then we can use the resolution principle to generate □ from S.
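The resolution principle can be sketched directly with Python sets (a hypothetical encoding of my own, not from the text): a clause is a frozenset of string literals, with a leading "~" marking negation.

```python
def negate(lit):
    """Complement a literal: P <-> ~P."""
    return lit[1:] if lit.startswith('~') else '~' + lit

def resolvents(c1, c2):
    """All resolvents of clauses c1 and c2 (frozensets of literals):
    for each complementary pair, drop the pair and union the rest."""
    out = set()
    for lit in c1:
        if negate(lit) in c2:
            out.add(frozenset((c1 - {lit}) | (c2 - {negate(lit)})))
    return out
```

Mirroring the example above, `resolvents(frozenset({'P', 'Q'}), frozenset({'~Q', 'R', '~S'}))` yields the single clause {P, R, ¬S}, and resolving the unit clauses {P} and {¬P} yields the empty clause (an empty frozenset).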
Example. Consider the following simplified automatic climate control (air conditioning and heating) system. The room temperature can be in one of the following three ranges:

comfortable: thermostat sensor detects the room temperature is within the comfort range, that is, between 68 and 78 degrees F.
hot: thermostat sensor detects the room temperature is above 78 degrees F.
cold: thermostat sensor detects the room temperature is below 68 degrees F.
Let
H = the room temperature is hot
C = the room temperature is cold
M = the room temperature is comfortable
A = the air conditioner is on
G = the heater is on.
We now specify the climate control system in English. If the room temperature is hot, then the air conditioner is on. If the room temperature is cold, then the heater is on. If the room temperature is neither hot nor cold, then the room temperature is comfortable. Can we prove the following? If neither the air conditioner nor the heater is on, then the room temperature is comfortable.
This English specification of the climate control system and the requirement to be proved can be expressed in propositional logic formulas as follows.
F1 = H → A
F2 = C → G
F3 = ¬(H ∨ C) → M

Prove: F4 = ¬(A ∨ G) → M.
We first prove this proposition with the truth-table technique, shown in Figure 2.4. This technique exhaustively checks every interpretation of the formula (F1 ∧ F2 ∧ F3) → F4 to determine whether it evaluates to T.

H   A   C   G   M   F1  F2  F3  F4  (F1 ∧ F2 ∧ F3) → F4
F   F   F   F   F   T   T   F   F   T
F   F   F   F   T   T   T   T   T   T
F   F   F   T   F   T   T   F   T   T
F   F   F   T   T   T   T   T   T   T
F   F   T   F   F   T   F   T   F   T
F   F   T   F   T   T   F   T   T   T
F   F   T   T   F   T   T   T   T   T
F   F   T   T   T   T   T   T   T   T
F   T   F   F   F   T   T   F   T   T
F   T   F   F   T   T   T   T   T   T
F   T   F   T   F   T   T   F   T   T
F   T   F   T   T   T   T   T   T   T
F   T   T   F   F   T   F   T   T   T
F   T   T   F   T   T   F   T   T   T
F   T   T   T   F   T   T   T   T   T
F   T   T   T   T   T   T   T   T   T
T   F   F   F   F   F   T   T   F   T
T   F   F   F   T   F   T   T   T   T
T   F   F   T   F   F   T   T   T   T
T   F   F   T   T   F   T   T   T   T
T   F   T   F   F   F   F   T   F   T
T   F   T   F   T   F   F   T   T   T
T   F   T   T   F   F   T   T   T   T
T   F   T   T   T   F   T   T   T   T
T   T   F   F   F   T   T   T   T   T
T   T   F   F   T   T   T   T   T   T
T   T   F   T   F   T   T   T   T   T
T   T   F   T   T   T   T   T   T   T
T   T   T   F   F   T   F   T   T   T
T   T   T   F   T   T   F   T   T   T
T   T   T   T   F   T   T   T   T   T
T   T   T   T   T   T   T   T   T   T

Figure 2.4 Truth table for proving F4.

The truth table shows that (F1 ∧ F2 ∧ F3) → F4 evaluates to T under every interpretation; thus the implication is valid.
Next we prove this proposition using the equivalency laws.
Prove: ¬(A ∨ G) → M. The premise is F1 ∧ F2 ∧ F3, which is

(H → A) ∧ (C → G) ∧ (¬(H ∨ C) → M)
= (¬H ∨ A) ∧ (¬C ∨ G) ∧ (¬¬(H ∨ C) ∨ M)   (Implication)
= (¬H ∨ A) ∧ (¬C ∨ G) ∧ ((H ∨ C) ∨ M)     (Double negation)
= A ∨ G ∨ M                               (Resolution twice)
= (A ∨ G) ∨ M                             (Associativity)
= ¬(A ∨ G) → M                            (Implication)
Therefore, we have shown that the following is valid: If neither the air conditioner nor the heater is on, then the room temperature is comfortable. However, we cannot conclude the following from the specification: If the room temperature is comfortable, then neither the air conditioner nor the heater is on, that is, M → ¬(A ∨ G).
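That the converse does not follow can be checked mechanically: a single interpretation that satisfies F1, F2, and F3 but falsifies M → ¬(A ∨ G) suffices. A brute-force sketch in Python (a hypothetical illustration, not from the text):

```python
from itertools import product

def counterexamples():
    """Yield all interpretations (H, A, C, G, M) in which the premises
    F1, F2, F3 hold but the converse M -> not(A or G) fails."""
    for H, A, C, G, M in product([False, True], repeat=5):
        f1 = (not H) or A               # F1: H -> A
        f2 = (not C) or G               # F2: C -> G
        f3 = (H or C) or M              # F3: not(H or C) -> M
        conv = (not M) or not (A or G)  # M -> not(A or G)
        if f1 and f2 and f3 and not conv:
            yield (H, A, C, G, M)
```

One such counterexample is H = C = G = F, A = M = T: the room temperature is comfortable yet the air conditioner is on, which the specification does not forbid.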
Proving Satisfiability Using the Resolution Procedure

Now we describe in detail the approach using the resolution principle to establish validity. Once a propositional formula is transformed into conjunctive normal form, the order of the subformulas joined by and can be changed without altering the meaning of the formula.
Two clause sets are equivalent if any truth-value assignment assigns the same truth value to both. Let S be a clause set. We define

R(S) = S ∪ {T : T is a resolvent of two clauses in S}.
The procedure using resolution to determine the satisfiability of individual propositional formulas consists of the steps shown in Figure 2.5.
This algorithm is an exhaustive approach to resolution since it forms all possible resolvents even though only a subset of these resolvents is needed to derive the empty
Resolution procedure:
(1) Transform the given formula into conjunctive normal form (CNF).
(2) Write this CNF formula in clausal form: a set S of clauses, each of which is a disjunction of literals.
(3) Compute R(S), R^2(S), . . . until R^i(S) = R^{i+1}(S) for some i.
(4) If □ ∈ R^i(S), then S is unsatisfiable; else S is satisfiable.

Figure 2.5 Resolution procedure for propositional logic.
clause. Hence, its complexity is exponential in the size of the original clause set S. To attempt to form only the needed resolvents, we define the concept of deduction. Given a clause set S, a deduction from S consists of a sequence of clauses C1, . . . , Cn where each Ci ∈ S or, for some a, b < i, Ci is a resolvent of Ca and Cb.
Resolution Theorem: A clause set S is unsatisfiable iff there is a deduction of the empty clause from S.
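The procedure of Figure 2.5 can be sketched as a saturation loop (a hypothetical sketch, using the same frozenset-of-literals encoding with "~" marking negation; this is my own illustration, not code from the text):

```python
def negate(lit):
    """Complement a literal: P <-> ~P."""
    return lit[1:] if lit.startswith('~') else '~' + lit

def resolvents(c1, c2):
    """All resolvents of two clauses (frozensets of literals)."""
    return {frozenset((c1 - {l}) | (c2 - {negate(l)}))
            for l in c1 if negate(l) in c2}

def unsatisfiable(clauses):
    """Compute R(S), R^2(S), ... ; S is unsatisfiable iff the empty
    clause appears before the sequence reaches a fixpoint."""
    s = set(clauses)
    while True:
        new = set()
        cs = list(s)
        for i in range(len(cs)):
            for j in range(i + 1, len(cs)):
                new |= resolvents(cs[i], cs[j])
        if frozenset() in new:
            return True
        if new <= s:  # R^i(S) = R^{i+1}(S): fixpoint, no empty clause
            return False
        s |= new

# Clause set of the climate control example:
S = [frozenset(c) for c in
     [{'~H', 'A'}, {'~C', 'G'}, {'H', 'C', 'M'},
      {'~A'}, {'~G'}, {'~M'}]]
```

Here `unsatisfiable(S)` returns True, confirming by the resolution theorem that (F1 ∧ F2 ∧ F3) → F4 is valid. This exhaustive saturation forms all resolvents at every round, so it is exponential in the worst case, as noted above.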
Example. Consider again the simplified automatic climate control example. Now we prove (F1 ∧ F2 ∧ F3) → F4 using this resolution theorem. We show that the negation of this formula is unsatisfiable. The negated formula is

¬((F1 ∧ F2 ∧ F3) → F4)
= ¬(¬(F1 ∧ F2 ∧ F3) ∨ F4)
= (F1 ∧ F2 ∧ F3) ∧ ¬F4.

Replacing F1, F2, F3, F4 with the original symbols, we convert this formula into CNF:

(¬H ∨ A) ∧ (¬C ∨ G) ∧ (H ∨ C ∨ M) ∧ ¬A ∧ ¬G ∧ ¬M.

Then we convert this CNF formula into clausal form:

S = {{¬H, A}, {¬C, G}, {H, C, M}, {¬A}, {¬G}, {¬M}}.
We are ready to derive a deduction of □ from S:

C1 = {¬H, A}   member of S
C2 = {¬C, G}   member of S
C3 = {H, C, M} member of S
C4 = {¬A}      member of S
C5 = {¬G}      member of S
C6 = {¬M}      member of S
C7 = {¬H}      resolvent of C1 and C4
C8 = {¬C}      resolvent of C2 and C5
C9 = {H, C}    resolvent of C3 and C6
C10 = {H}      resolvent of C8 and C9
C11 = □        resolvent of C7 and C10.