
3 Comparison

The past decades of research on formal methods, modular systems, and component reuse have set the stage for serious consideration of how these areas might best be integrated.

There is a wealth of research on program proving that served as the foundation for work on reasoning about data abstractions with respect to their specifications, for example the LARCH [GH93] and RESOLVE [SW94] efforts. These approaches have carried on the tradition of using rich specification formalisms to give complete specifications of abstraction behavior. They necessarily use heavyweight manual or automated approaches to reason about the conformance of implementations to those specifications. Both make the crucial observation that scalable formal reasoning is feasible only if it can be done in a local and modular fashion.

There has been a significant amount of research on techniques for reasoning about specifications of complete systems, e.g., [CGL94,McM93,YTL+95,DBDS93]. There has been relatively little work on applying those analysis techniques to incompletely specified systems, e.g., parameterized frameworks and component-based systems, or on applying them in a way that achieves local certifiability [WH94].

The need for analyses to scale and to integrate with practical development methods has led compiler analysis research to consider partial systems, for example in inter-procedural flow analyses. Those same needs are beginning to shape the more general analyses targeted at software verification and validation.

Work on compositional analysis, e.g., [YY91,CA94,CK95,CK96], has, of necessity, dealt with the question of analyzing sub-systems. These techniques have the luxury of embedding such sub-system analyses in the context of a whole-program analysis. In this setting, a specification of the behavior expected at a sub-system's interface can be generated automatically from the sub-system itself, e.g., by a process of hiding and partition refinement. The work outlined in this paper requires explicit specifications to be constructed. Consequently, issues related to the appropriateness, usability, and exploitability of the specification formalisms come into play.
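The hiding-and-refinement step mentioned above can be illustrated concretely. The sketch below is purely illustrative (it does not reproduce any of the cited tools): a sub-system is modeled as a small labelled transition system, its internal actions are hidden by renaming them to a silent action, and partition refinement then coarsens the states into a strong-bisimulation quotient that serves as an abstract interface specification. The event names and the example LTS are invented for illustration.

```python
# Illustrative sketch (assumed example, not from the cited work):
# deriving an abstract interface specification for a sub-system by
# (1) hiding internal actions and (2) partition refinement.

# A labelled transition system: state -> list of (label, next_state).
# States 0-2 and 3-5 differ only in their internal logging action.
lts = {
    0: [("req", 1)],
    1: [("log", 2)],      # internal to the sub-system
    2: [("ack", 0)],
    3: [("req", 4)],
    4: [("debug", 5)],    # internal to the sub-system
    5: [("ack", 3)],
}

INTERNAL = {"log", "debug"}

def hide(lts, internal):
    """Replace internal labels with the silent action 'tau'."""
    return {s: [("tau" if a in internal else a, t) for a, t in trans]
            for s, trans in lts.items()}

def refine(lts):
    """Split states until every block's members have identical
    (label, target-block) signatures: a bisimulation quotient."""
    partition = [set(lts)]
    changed = True
    while changed:
        changed = False
        block_of = {s: i for i, b in enumerate(partition) for s in b}
        new_partition = []
        for block in partition:
            groups = {}
            for s in block:
                sig = frozenset((a, block_of[t]) for a, t in lts[s])
                groups.setdefault(sig, set()).add(s)
            if len(groups) > 1:
                changed = True
            new_partition.extend(groups.values())
        partition = new_partition
    return partition

abstract = refine(hide(lts, INTERNAL))
print(sorted(sorted(b) for b in abstract))  # → [[0, 3], [1, 4], [2, 5]]
```

After hiding, the two variants become indistinguishable and collapse into a single three-state abstract protocol (request, internal step, acknowledge); without hiding, the distinct internal labels would keep all six states apart.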

Some recent work has begun to look at issues relevant to the specification and analysis of component-based systems. Avrunin et al. [ACD97] consider the analysis of systems that are defined in a mixture of implementation and specification languages. This approach allows the analysis of concurrent program components implemented in Ada along with missing system components specified with regular expressions or GIL [DKM+94] formulae. Two specification languages are used to address different issues in the specification of system and environment behavior; regular expressions capture ordering relationships and GIL formulae capture timing information. Mixed-formalism system descriptions are common in the early stages of all software development. They are particularly relevant to software frameworks, since we can never anticipate all of the potential parameters and usage contexts for a framework and must instead specify their requirements. While not a static analysis technique, recent work by Wang et al. [WM97] explicitly considers the issues that arise in validating parameterized software frameworks. The notion of symbolically executing a framework using abstract parameters is similar in spirit to our use of parameter specifications in exhaustive static analysis of framework behavior [Dwy97].
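The use of regular expressions to capture ordering relationships can be sketched as follows. This is a minimal, hypothetical example in the spirit of the mixed-formalism approach described above, not the actual [ACD97] notation: events of a missing component are encoded as letters, and a trace conforms to the component's ordering specification if the encoded trace matches the expression.

```python
import re

# Hypothetical sketch: a regular expression as the ordering
# specification for a missing component. Event names (invented for
# illustration) are mapped to single letters so that an event trace
# becomes a string the specification can match against.

EVENTS = {"open": "o", "read": "r", "close": "c"}

# Spec: every session opens, performs zero or more reads, then closes.
SPEC = re.compile(r"(or*c)*")

def conforms(trace):
    """True iff the event trace satisfies the ordering specification."""
    encoded = "".join(EVENTS[e] for e in trace)
    return SPEC.fullmatch(encoded) is not None

assert conforms(["open", "read", "read", "close"])
assert conforms(["open", "close", "open", "read", "close"])
assert not conforms(["read", "close"])   # read before open
assert not conforms(["open", "read"])    # session never closed
```

Timing constraints of the kind GIL formulae express (e.g., bounds on the delay between open and close) fall outside what a regular expression over event names can state, which is why the mixed-formalism approach pairs the two notations.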



Matthew B. Dwyer
Sept. 2, 1997