(*  Title:      Doc/IsarAdvanced/Functions/Thy/Fundefs.thy
    ID:         $Id$
    Author:     Alexander Krauss, TU Muenchen

Tutorial for function definitions with the new "function" package.
*)

theory Functions
imports Main
begin

chapter {* Defining Recursive Functions in Isabelle/HOL *}

section {* Function Definition for Dummies *}

text {*
  In most cases, defining a recursive function is just as simple as other definitions:
*}

fun fib :: "nat \<Rightarrow> nat"
where
  "fib 0 = 1"
| "fib (Suc 0) = 1"
| "fib (Suc (Suc n)) = fib n + fib (Suc n)"

text {*
  The function always terminates, since its argument gets smaller in every
  recursive call. Termination is an important requirement, since it
  prevents inconsistencies: From the "definition" @{text "f(n) = f(n) + 1"}
  we could prove @{text "0 = 1"} by subtracting @{text "f(n)"} on both sides.

  Isabelle tries to prove termination automatically when a function is
  defined. We will later look at cases where this fails and see what to
  do then.
*}

subsection {* Pattern matching *}

text {* \label{patmatch}
  As in functional programming, functions can be defined by pattern
  matching. At the moment we will only consider \emph{datatype
  patterns}, which consist only of datatype constructors and
  variables.

  If patterns overlap, the order of the equations is taken into
  account. The following function inserts a fixed element between any
  two elements of a list:
*}

fun sep :: "'a \<Rightarrow> 'a list \<Rightarrow> 'a list"
where
  "sep a (x#y#xs) = x # a # sep a (y # xs)"
| "sep a xs       = xs"

text {* 
  Overlapping patterns are interpreted as "increments" to what is
  already there: The second equation is only meant for the cases where
  the first one does not match. Consequently, Isabelle replaces it
  internally by the remaining cases, making the patterns disjoint:
*}

thm sep.simps

text {* @{thm [display] sep.simps[no_vars]} *}

text {* 
  The equations from function definitions are automatically used in
  simplification:
*}

lemma "sep (0::nat) [1, 2, 3] = [1, 0, 2, 0, 3]"
by simp


subsection {* Induction *}

text {*
  Isabelle provides customized induction rules for recursive functions.
  These rules follow the recursive structure of the definition; see
  also \cite[\S3.5.4]{isa-tutorial}.
*}
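
text {*
  For instance, a simple fact relating @{const sep} and @{const map}
  can be proved by induction with the rule @{text "sep.induct"}
  generated from the definition above:
*}

lemma "map f (sep a xs) = sep (f a) (map f xs)"
by (induct a xs rule: sep.induct) simp_all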

section {* Full form definitions *}

text {* 
  Up to now, we have been using the \cmd{fun} command, which provides a
  convenient shorthand notation for simple function definitions. In
  this mode, Isabelle tries to solve all the necessary proof obligations
  automatically. If a proof does not go through, the definition is
  rejected. This can either mean that the definition is indeed faulty,
  or that the default proof procedures are just not smart enough (or
  rather: not designed) to handle the definition.

  By expanding the abbreviated \cmd{fun} to the full \cmd{function}
  command, the proof obligations become visible and can be analyzed or
  solved manually.

\end{isamarkuptext}


\fbox{\parbox{\textwidth}{
\noindent\cmd{fun} @{text "f :: \<tau>"}\\%
\cmd{where}\isanewline%
\ \ {\it equations}\isanewline%
\ \ \quad\vdots
}}

\begin{isamarkuptext}
\vspace*{1em}
\noindent abbreviates
\end{isamarkuptext}

\fbox{\parbox{\textwidth}{
\noindent\cmd{function} @{text "("}\cmd{sequential}@{text ") f :: \<tau>"}\\%
\cmd{where}\isanewline%
\ \ {\it equations}\isanewline%
\ \ \quad\vdots\\%
\cmd{by} @{text "pat_completeness auto"}\\%
\cmd{termination by} @{text "lexicographic_order"}
}}

\begin{isamarkuptext}
  \vspace*{1em}
  \noindent Some declarations and proofs have now become explicit:

  \begin{enumerate}
  \item The \cmd{sequential} option enables the preprocessing of
  pattern overlaps which we already saw. Without this option, the
  equations must already be disjoint and complete. The automatic
  completion only works with datatype patterns.

  \item A function definition now produces a proof obligation which
  expresses completeness and compatibility of patterns (we will talk
  about this later). The combination of the methods
  @{text "pat_completeness"} and @{text "auto"} is used to solve this
  proof obligation.

  \item A termination proof follows the definition, started by the
  \cmd{termination} command, which sets up the goal. The
  @{text "lexicographic_order"} method can prove termination of a
  certain class of functions by searching for a suitable lexicographic
  combination of size measures.
  \end{enumerate}

  Whenever a \cmd{fun} command fails, it is usually a good idea to
  expand the syntax to the more verbose \cmd{function} form, to see
  what is actually going on.
*}
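
text {*
  \noindent As an illustration, the @{const fib} example from the
  beginning can be written out in this expanded form as follows. A
  fresh name @{text "fib2"} is used here, since @{const fib} is
  already defined:
*}

function (sequential) fib2 :: "nat \<Rightarrow> nat"
where
  "fib2 0 = 1"
| "fib2 (Suc 0) = 1"
| "fib2 (Suc (Suc n)) = fib2 n + fib2 (Suc n)"
by pat_completeness auto

termination by lexicographic_order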

section {* Proving termination *}

text {*
  Consider the following function, which sums up the natural numbers
  from a counter @{text "i"} up to a bound @{text "N"}:
*}

function sum :: "nat \<Rightarrow> nat \<Rightarrow> nat"
where
  "sum i N = (if i > N then 0 else i + sum (Suc i) N)"
by pat_completeness auto

text {*
  \noindent The @{text "lexicographic_order"} method fails on this
  example, because none of the arguments decreases in the recursive
  call.

  A more general method for termination proofs is to supply a wellfounded
  relation on the argument type, and to show that the argument
  decreases in every recursive call.

  The termination argument for @{text "sum"} is based on the fact that
  the \emph{difference} between @{text "i"} and @{text "N"} gets
  smaller in every step, and that the recursion stops when @{text "i"}
  is greater than @{text "N"}. Phrased differently, the expression
  @{text "N + 1 - i"} decreases in every recursive call.

  We can use this expression as a measure function suitable to prove
  termination.
*}

termination 
by (relation "measure (\<lambda>(i,N). N + 1 - i)") auto

text {*
  The @{text relation} method takes a relation of
  type @{typ "('a \<times> 'a) set"}, where @{typ "'a"} is the argument type of
  the function. If the function has multiple curried arguments, these
  are packed together into a tuple, as in the above example.

  The predefined function @{term_type "measure"} is a very common way of
  specifying termination relations in terms of a mapping into the
  natural numbers.

  After the invocation of @{text "relation"}, we must prove (a) that
  the relation we supplied is wellfounded, and (b) that the arguments
  of recursive calls indeed decrease with respect to the
  relation. Both goals are solved by the subsequent call to
  @{text "auto"}.

  Let us complicate the function a little, by adding some more
  recursive calls: 
*}

function foo :: "nat \<Rightarrow> nat \<Rightarrow> nat"
where
  "foo i N = (if i > N 
              then (if N = 0 then 0 else foo 0 (N - 1))
              else i + foo (Suc i) N)"
by pat_completeness auto

text {*
  When @{text "i"} has reached @{text "N"}, it starts at zero again
  and @{text "N"} is decremented. This corresponds to a nested
  loop where one index counts up and the other down. Termination can
  be proved using a lexicographic combination of two measures, namely
  the value of @{text "N"} and the above difference. The @{const
  "measures"} combinator generalizes @{text "measure"} by taking a
  list of measure functions.  
*}

termination 
by (relation "measures [\<lambda>(i, N). N, \<lambda>(i,N). N + 1 - i]") auto


section {* Mutual Recursion *}

text {*
  If two or more functions call one another mutually, they have to be defined
  in one step. The simplest examples are probably @{text "even"} and @{text "odd"}:
*}

function even :: "nat \<Rightarrow> bool"
    and odd  :: "nat \<Rightarrow> bool"
where
  "even 0 = True"
| "odd 0 = False"
| "even (Suc n) = odd n"
| "odd (Suc n) = even n"
by pat_completeness auto

text {*
  To solve the problem of mutual dependencies, Isabelle internally
  creates a single function operating on the sum
  type. Then the original functions are defined as
  projections. Consequently, termination has to be proved
  simultaneously for both functions, by specifying a measure on the
  sum type: 
*}

termination 
by (relation "measure (\<lambda>x. case x of Inl n \<Rightarrow> n | Inr n \<Rightarrow> n)") 
   auto

subsection {* Induction for mutual recursion *}

text {*
  When functions are mutually recursive, proving properties about them
  generally requires simultaneous induction. The induction rules
  generated from the definitions reflect this.

  Let us prove something about @{const even} and @{const odd}:
*}

lemma 
  "even n = (n mod 2 = 0)"
  "odd n = (n mod 2 = 1)"

txt {* 
  We apply simultaneous induction, specifying the induction variable
  for both goals, separated by \cmd{and}:  *}

apply (induct n and n rule: even_odd.induct)

txt {* 
  We get four subgoals, which correspond to the clauses in the
  definition of @{const even} and @{const odd}:
  @{subgoals[display,indent=0]}
  Simplification solves the first two goals, leaving us with two
  statements about the @{text "mod"} operation to prove:
*}

apply simp_all

txt {* 
  @{subgoals[display,indent=0]} 

  \noindent These can be handled by the decision procedure for
  Presburger arithmetic.
*}

apply presburger
apply presburger
done

text {*
  Even if we were just interested in one of the statements proved by
  simultaneous induction, the other ones may be necessary to
  strengthen the induction hypothesis. If we had left out the statement
  about @{const odd} (by substituting it with @{term "True"}), our
  proof would have failed:
*}

lemma 
  "even n = (n mod 2 = 0)"
  "True"
apply (induct n rule: even_odd.induct)

txt {*
  \noindent Now the third subgoal is a dead end, since we have no
  useful induction hypothesis:

  @{subgoals[display,indent=0]} 
*}

oops

section {* More general patterns *}

subsection {* Avoiding pattern splitting *}

text {*
  Up to now, we used pattern matching only on datatypes, and the
  patterns were always disjoint and complete, and if they weren't,
  they were made disjoint automatically, as in the definition of
  @{const "sep"} in \S\ref{patmatch}.

  This splitting can significantly increase the number of equations
  involved, and is not always necessary. The following simple example
  shows the problem:

  Suppose we are modelling incomplete knowledge about the world by a
  three-valued datatype, which has the values @{term "T"}, @{term "F"}
  and @{term "X"} for true, false and uncertain propositions. 
*}

datatype P3 = T | F | X

text {* Then the conjunction of such values can be defined as follows: *}

fun And :: "P3 \<Rightarrow> P3 \<Rightarrow> P3"
where
  "And T p = p"
| "And p T = p"
| "And p F = F"
| "And F p = F"
| "And X X = X"


text {* 
  This definition is useful, because the equations can directly be used
  as rules to simplify expressions. But the patterns overlap, e.g.~the
  expression @{term "And T T"} is matched by the first two
  equations. By default, Isabelle makes the patterns disjoint by
  splitting them up, producing instances:
*}

thm And.simps

text {*
  @{thm[indent=4] And.simps}

  \vspace*{1em}
  \noindent There are several problems with this approach:

  \begin{enumerate}
  \item When datatypes have many constructors, there can be an
  explosion of equations. For @{const "And"}, we get seven instead of
  five equations, which can be tolerated, but this is just a small
  example.

  \item Since splitting makes the equations more specific, they
  do not always match in rewriting. While the term @{term "And x F"}
  can be simplified to @{term "F"} by the original specification, a
  (manual) case split on @{term "x"} is now necessary.

  \item The splitting also concerns the induction rule @{text
  "And.induct"}. Instead of five premises it now has seven, which
  means that our induction proofs will have more cases.

  \item In general, it increases clarity if we get the same definition
  back which we put in.
  \end{enumerate}

  On the other hand, a definition needs to be consistent, and defining
  both @{term "f x = True"} and @{term "f x = False"} is a bad
  idea. So if we don't want Isabelle to mangle our definitions, we
  will have to prove that this is not necessary. By using the full
  definition form without the \cmd{sequential} option, we get this
  behaviour: 
*}

function And2 :: "P3 \<Rightarrow> P3 \<Rightarrow> P3"
where
  "And2 T p = p"
| "And2 p T = p"
| "And2 p F = F"
| "And2 F p = F"
| "And2 X X = X"

txt {*
  Now it is also time to look at the subgoals generated by a
  function definition. In this case, they are:

  @{subgoals[display,indent=0]} 

  The first subgoal expresses the completeness of the patterns. It has
  the form of an elimination rule and states that every @{term x} of
  the function's input type must match one of the patterns. It could
  be equivalently stated as a disjunction of existential statements:
  @{term "(\<exists>p. x = (T, p)) \<or> (\<exists>p. x = (p, T)) \<or> (\<exists>p. x = (p, F)) \<or>
  (\<exists>p. x = (F, p)) \<or> (x = (X, X))"}. If the patterns just involve
  datatypes, we can solve it with the @{text "pat_completeness"} method:
*}

apply pat_completeness

txt {*
  The remaining subgoals express \emph{pattern compatibility}. We do
  allow a value to be matched by more than one pattern, but in this
  case the results (i.e.~the right hand sides of the equations) must
  also be equal. For each pair of patterns, there is one such
  subgoal. Usually this needs injectivity of the constructors, which
  is used automatically by @{text "auto"}.
*}

by auto
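
text {*
  \noindent Since @{const And2} is not recursive, any wellfounded
  relation, for instance the empty one, can be used to discharge the
  termination proof:
*}

termination
by (relation "{}") simp

text {*
  \noindent For illustration, the effect on rewriting can now be
  observed directly: the unsplit equation of @{const And2} rewrites
  @{term "And2 x F"} in a single step, whereas the split equations of
  @{const And} require a case split on @{term "x"}:
*}

lemma "And2 x F = F"
by simp

lemma "And x F = F"
by (cases x) simp_all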


subsection {* Non-constructor patterns *}

text {* FIXME *}


section {* Partiality *}

text {* 
  In HOL, all functions are total. A function @{term "f"} applied to
  @{term "x"} always has a value @{term "f x"}, and there is no notion
  of undefinedness. 

  FIXME
*}


section {* Nested recursion *}

text {* FIXME *}

end