I. properties of functions (4.1)

A. equality

------------------------------------------
REASONING ABOUT EQUALITY (4.1)

Equivalence rules:

   _________ (== reflexive)
   |- t == t

   Phi |- t == t'    Phi' |- t' == t''
   ____________________________________ (== transitive)
       Phi \union Phi' |- t == t''

   Phi |- t == t'
   ______________ (== symmetric)
   Phi |- t' == t
------------------------------------------

what's the type of the t, t', and t'' metavariables?

How do the assumptions work in the transitive rule?

should there be some assumptions used in the reflexivity axiom?

suppose we are given that |- Charity == Love and |- God == Love,
what is a deduction that proves that Charity == God?

B. substitution and congruence

1. substitution

------------------------------------------
SUBSTITUTION

   Phi' |- t1 == t1'    Phi |- t[v := t1]
   ______________________________________ (substitution)
       Phi \union Phi' |- t[v := t1']

   * if t1 and t1' are free for v in t

   Phi |- t1 == t1'                  (substitute equals
   _______________________________    for equals)
   Phi |- t[v := t1] = t[v := t1']

   * if t1 and t1' are free for v in t
------------------------------------------

What's the meaning of "t1 is free for v in t"?

What's the meaning of their use of double quotes in derivations
where the substitution rule is used?

2. congruence

------------------------------------------
FOR YOU TO DO

Derive the following rules from the substitution rule:

   Phi |- t' == t''
   ____________________ (operand congruence)
   Phi |- t.t' == t.t''

   Phi |- t' == t''
   ____________________ (operator congruence)
   Phi |- t'.t == t''.t
------------------------------------------

3. abstraction

------------------------------------------
ABSTRACTION

   Phi |- t == t'
   ______________________________ (abstraction)
   Phi |- (\ v . t) == (\ v . t')

   * if v is not free in Phi

Example:

   Arith |- 0 + 1 == 1
   ___________________________________ (abstraction)
   Arith |- (\ v . 0 + 1) == (\ v . 1)

Capture (wrong):

   Arith |- z + 1 == 1
   ___________________________________
   Arith |- (\ z . z + 1) == (\ z . 1)
------------------------------------------

what's wrong with the capture example?

C. conversion rules

------------------------------------------
CONVERSION RULES

   _________________________________ (alpha conversion)
   |- (\ v . t) == (\ w . t[v := w])

   * w does not occur free in t

   _____________________________ (beta conversion)
   |- (\ v . t).t' == t[v := t']

   * t' is free for v in t

   ___________________ (eta conversion)
   |- (\ v . t.v) == t

   * v is not free in t

   Phi |- t.v == t'.v
   __________________ (extensionality)
   Phi |- t == t'

   * v is not free in Phi, t, or t'
------------------------------------------

What does "t' is free for v in t" mean?

Why all of these side conditions?

II. derivations (4.2)

what kind of practical problems might you run into if you try to use
the derivation formats for reasonably sized, realistic proofs?

------------------------------------------
BASIC CALCULATIONAL PROOF FORMAT

Translate

   D1  D2  ...  Dm
   _________________ {rule name}
   Phi |- t == t'

into:

   Phi |- t
   == { rule name }
        * [[D1]]
        * [[D2]]
          ...
        * [[Dm]]
   . t'

where [[Di]] is the translation of Di
------------------------------------------

------------------------------------------
USING TRANSITIVITY

Translate

        D1                             Dm
   _______________ {R1}   ...   _________________ {Rm}
   Phi |- t0 == t1              Phi |- tm-1 == tm
   ______________________________________________ {transitivity}
               Phi |- t0 == tm

into:

   Phi |- t0
   == { R1 }
        [[D1]]
   . t1
   == { R2 }
        [[D2]]
   . t2
     ...
   . tm-1
   == { Rm }
        [[Dm]]
   . tm

where [[Di]] is the translation of Di
------------------------------------------

what savings are achieved by this proof format?

How would you prove from arithmetic that (\ x . 2*x) == (\ y . y + y)?
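Before moving on to definitions, here is one way to see what the side
conditions in section I ("t1 is free for v in t", "v is not free in Phi")
are guarding against.  The Haskell sketch below is not part of the text;
the names Term, freeVars, and subst are invented here for illustration.
It makes the "free for" condition operational: subst returns Nothing
exactly when a free variable of the substituted term would be captured
by a binder, as in the "capture (wrong)" example above.

------------------------------------------
HASKELL SKETCH: THE "FREE FOR" SIDE CONDITION

-- A minimal sketch, not from the text; Term, freeVars, and subst
-- are made-up names.
module FreeFor where

import Data.List (delete, union)

data Term
  = Var String       -- variables v, w, z, ...
  | App Term Term    -- application  t.t'
  | Lam String Term  -- abstraction  (\ v . t)
  deriving (Eq, Show)

-- free variables of a term
freeVars :: Term -> [String]
freeVars (Var x)   = [x]
freeVars (App t u) = freeVars t `union` freeVars u
freeVars (Lam x t) = delete x (freeVars t)

-- subst t v t1 computes t[v := t1], provided t1 is free for v in t;
-- otherwise it returns Nothing (substituting would capture a variable)
subst :: Term -> String -> Term -> Maybe Term
subst t@(Var x) v t1
  | x == v    = Just t1
  | otherwise = Just t
subst (App t u) v t1 = App <$> subst t v t1 <*> subst u v t1
subst lam@(Lam x t) v t1
  | x == v    = Just lam        -- v is rebound here; no free v inside
  | v `elem` freeVars t && x `elem` freeVars t1
              = Nothing         -- a free x in t1 would fall under (\ x . ...)
  | otherwise = Lam x <$> subst t v t1

-- Roughly the capture example: substituting a term with a free z
-- underneath the binder (\ z . ...) is rejected.
captureExample :: Maybe Term
captureExample =
  subst (Lam "z" (Var "v")) "v" (App (App (Var "+") (Var "z")) (Var "1"))
  -- captureExample == Nothing
------------------------------------------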
III. definitions (4.3)

Why are we worried about consistency when extending a theory?

What's a conservative extension?

A. constant definition

------------------------------------------
CONSTANT DEFINITION

Format:

       ^
     c = t                              (name)

     where t is a closed term

Effect:
   - adds new constant c to SG
   - adds new axiom |- c = t to the rules

Example:

       ^
     o = (\ f g x . f.(g.x))            (backward
                                         function composition)

Shorthand for functions:

          ^
     id.x = x                           (identity)

              ^
     (f * g).x = g.(f.x)                (forward composition)

              ^
     (f o g).x = f.(g.x)                (backward composition)
------------------------------------------

how is this like Haskell?

how are the types of constant definitions figured out?

B. abbreviations (syntactic sugars)

1. local definitions

------------------------------------------
LET

                     ^
   (let v = t' in t) = (\ v . t).t'

Derived inference rules:

   ________________________ (vacuous let)
   |- (let v = t' in t) = t

   * if v is not free in t

   ______________________________________________________ (let over \)
   |- (let v = t' in (\ y . t)) = (\ y . let v = t' in t)

   * if y is not free in t'
------------------------------------------

Can you simplify let x = 3 in x + x ?

IV. product types (4.4)

------------------------------------------
ASCII NOTATION

``x'' means product (LaTeX's \times)
------------------------------------------

------------------------------------------
PAIRS

Pair Types:  S1 x S2

Associativity and precedence:
   S1 x S2 x S3   means   S1 x (S2 x S3)
   S1 x S2 -> S3  means   (S1 x S2) -> S3
   S1 -> S2 x S3  means   S1 -> (S2 x S3)

Constants:
   pair: S -> T -> S x T
   fst:  S x T -> S
   snd:  S x T -> T

   we write pair.t.t' as (t,t')
   (t1,t2,t3) means (t1,(t2,t3))

Postulates:
   |- (fst.x, snd.x) == x                    (pairing)
   |- fst.(x,y) == x                         (projection 1)
   |- snd.(x,y) == y                         (projection 2)
   |- (x,y) == (x',y')                       (congruence)
         <==> x == x' /\ y == y'

Abbreviation:

                    ^
   (\ u:S, v:T . t) ==                       (paired abstraction)
      (\ w:S x T . t[u,v := fst.w, snd.w])
------------------------------------------

Do we need to worry about the associativity of product types and pairing?

How would you generalize the alpha, beta, and eta rules to paired abstractions?

How would you simplify (\ x, y . x + y).(3,4) ?
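As a rough answer sketch to "how is this like Haskell?": the constant
definitions, the let abbreviation, and the product-type constants have
close analogues in ordinary Haskell.  The sketch below is not from the
text; bcompose, fcompose, letExample, and pairedAbs are names invented
here, while id, fst, snd, and uncurry are standard Haskell.

------------------------------------------
HASKELL ANALOGUES (SKETCH)

-- A sketch only; bcompose, fcompose, letExample, and pairedAbs are
-- invented names for illustration.
module DefsAndPairs where

-- id.x = x  corresponds to Haskell's  id x = x

-- (f o g).x = f.(g.x)   (backward composition; Haskell's (.))
bcompose :: (b -> c) -> (a -> b) -> a -> c
(f `bcompose` g) x = f (g x)

-- (f * g).x = g.(f.x)   (forward composition)
fcompose :: (a -> b) -> (b -> c) -> a -> c
(f `fcompose` g) x = g (f x)

-- (let v = t' in t) = (\ v . t).t', so
--   let x = 3 in x + x  ==  (\ x . x + x).3  ==  3 + 3  ==  6
letExample :: Int
letExample = (\x -> x + x) 3

-- A paired abstraction (\ u, v . t) unpacks its argument with fst/snd;
-- Haskell's uncurry does the same job, so
--   (\ x, y . x + y).(3,4)  ==  3 + 4  ==  7
pairedAbs :: (Int, Int) -> Int
pairedAbs = uncurry (\x y -> x + y)

main :: IO ()
main = print (letExample, pairedAbs (3, 4))   -- prints (6,7)
------------------------------------------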