2nd International Symposium on Leveraging Applications
of Formal Methods, Verification and Validation
15-19 November 2006 - Coral Beach Resort (Paphos, Cyprus)
The purpose of the WCET Tool Challenge is to study, compare, and discuss the properties of different WCET tools and approaches, to define common metrics, and to enhance the existing benchmarks. The WCET Tool Challenge has been designed to strike a balance between openness to a wide range of analysis approaches and specific participation guidelines that provide a level playing field. This should make results transparent and facilitate friendly competition among the participants. However, participants and other interested parties should be aware that results from different WCET tools will still be hard to compare directly, and there is not yet an established classification or set of performance metrics in this field. Therefore, the purpose of this Challenge is not to establish "winning tools". For a more detailed discussion of the actual goals, see here.
The WCET Tool Challenge will be carried out during the autumn of 2006. This specification of the WCET Tool Challenge is based on discussions at the WCET 2006 workshop and the ARTIST2 Timing Analysis group meeting in Dresden at the beginning of July 2006.
The WCET Tool Challenge will concentrate on three aspects of WCET analysis (more info below):
Companies as well as research groups are welcome to participate. The actual work with the tools will be done by an external student (see below) and/or the development teams. The evaluation will target a set of benchmark programs.
The report to ISoLA 2006 will be based on the reports from the developers and the student. The report will be compiled by the working group.
WCET tool developers are asked to enroll by sending an email to Jan Gustafsson no later than 2006-08-31. In your email, please state information and your choices according to the following:
*) The WCET survey has a detailed treatment of the classification and clarifies this point.
More information concerning some of these bullets is found further down in this document.
More detailed directions as decided by the working team will be sent out during August, as well as a more detailed time schedule. If the measurements are to be done by the development team, your results should be sent in during the first half of October, so that the summary report for ISoLA 2006 can be produced.
A working group is set up with Dr. Jan Gustafsson, Mälardalen University, Prof. Dr. Reinhard Wilhelm, Saarland, Prof. Dr. Reinhard v. Hanxleden, Kiel University, Dr. rer. nat. Steffen Goerzig, DaimlerChrysler, and Prof. Dr. Paul Levi, Stuttgart University, as current members. We are currently looking for a student who would set up the logistics and do the experimentation supervised by a competent person.
Area 1: Flow analysis
The purpose of the flow analysis phase is to extract the dynamic behaviour of the program. This includes information on which functions are called, loop bounds, dependencies between if-statements, etc. We propose the following flow analysis metrics to be measured:
The required user interaction naturally grows with each round. For each round, the metrics are measured for the three aspects. For each metric, the complete setup is described.
Participants may choose to use one or both of these approaches.
There will be two main types of code:
Since this is the first event of this kind, we expect some difficulties when analysing the software. For example, not all benchmarks have been tested with all tools. Please be patient and see this as part of the learning process. Analyse as many of the benchmarks as possible, and try to solve the problems as they appear. There are a number of considerations when analysing the benchmarks, see here and here.
We suggest that each participant selects up to three processors for which to do the analyses; for example, one simple (e.g., Renesas H8), one moderately complex (e.g., ARM7/9, C167, NEC V850E), and one very complex (e.g., PowerPC), if possible.
To be able to compare results, we propose that the most commonly supported processors are selected. See here for a list of currently supported processors in WCET tools. If possible, avoid processors supported by only one tool.
We are aware that not all WCET tools support all processors, and that results will sometimes be hard to compare.
As there is no overview of which compilers are supported by which tools, we let the participants decide on one or two compilers. We ask you to choose common compilers for the chosen processors, if possible.
We are aware that the results will be hard to compare, since we cannot mandate the use of specific compilers.
Welcome to participate in the WCET Tool Challenge 2006!
The Working Team
Jan Gustafsson <jan DOT gustafsson AT mdh DOT se>, Reinhard Wilhelm <wilhelm AT cs DOT uni-sb DOT de>, Reinhard v. Hanxleden < rvh AT informatik DOT uni-kiel DOT de>, Steffen Goerzig <steffen DOT goerzig AT daimlerchrysler DOT com>, and Paul Levi <Paul DOT Levi AT informatik DOT uni-stuttgart DOT de>