> But all software produced is fault. This has been proven.

Well, that hasn't always been true :-( It comes back to what I was saying about the concept of provability last week. It _is_ possible to "prove" the correctness of a piece of code against a formal specification:

- there's a whole branch of mathematics (denotational semantics) devoted to exactly this
- there are a couple of branches of mathematics (domain theory, lattice theory) devoted to manipulating the mathematical structures required to do it

This is a bit esoteric, to say the least - the "introduction to domain theory" and "introduction to lattice theory" courses are typically post-graduate options.

HOWEVER, and this is important, the provability of correctness of an application depends, in turn, on proof of correctness of the compiler, the underlying O/S and, indeed, the hardware on which the application runs. To the best of my knowledge, the only commercial company that ever went into this in a big way was INMOS, and the only chipset ever proved valid was the T800 transputer... as you might imagine, the only compiler ever proved correct for that chipset was INMOS' own OCCAM compiler.

Thus ended the 1980s...

... in the 1990s, "buggy, but cheap" attracted far more buyers. Remember the Pentium floating-point bug? At the time, I was working for an engineering firm that NEEDED that precision when designing the aforementioned bridges etc. INMOS collapsed, was bought out by one of the big French state-subsidised IT houses, and the transputer slipped into a footnote :-(

Regards,
Mark
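
P.S. For anyone who wants to see that Pentium bug for themselves: the widely circulated trigger case (due to Thomas Nicely, who found the bug) was the division 4195835 / 3145727. A correct FPU gives 1.333820449...; the flawed divider returned roughly 1.333739068 - wrong from the fifth significant digit onward, which is exactly the kind of error that bites when you're sizing bridge members. A quick sketch in Python on modern (correct!) hardware:

```python
# Nicely's Pentium FDIV test case: 4195835 / 3145727.
x, y = 4195835.0, 3145727.0
q = x / y
print(f"{q:.9f}")   # correct hardware: 1.333820449

# The flawed Pentium divider returned roughly 1.333739068,
# so the popular spreadsheet check x - (x/y)*y came out
# around 256 on affected chips instead of ~0.
print(x - q * y)
```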