

(In Conjunction with the International Conference on Dependable Systems and Networks, DSN2006)

Wednesday, June 28, 2006
Sheraton Society Hill, Philadelphia, PA, USA



Empirical evaluation of dependability complements modeling and analytical methods. Although empirical evaluation applies to real systems and is therefore more realistic, more accurate, and provides a higher level of confidence, it is not extensively used, for several reasons: it is time- and effort-consuming; its results have limited portability; and there are no benchmarks defining what measures to collect, how to measure the data, how to report it, what models indicate the ranges of "good" and "bad" values, and how to compare values across different systems. Because of these challenges, very few results and little empirical data are available for dependability evaluation, comparison, and benchmarking. Acknowledging these challenges, as well as the need for empirical dependability evaluation, this workshop will bring together researchers and practitioners to share their solutions and results, and to document their needs, problems, and ideas on this topic. The workshop will include presentations and a hands-on session in which participants will work on finding solutions to pre-selected questions.


This workshop focuses on the empirical aspects of systems dependability evaluation. "Systems" include computers, networks, and software, both in operation and under development. Process assessment and evaluation (e.g., CMM/CMMI-type assessments) are outside the scope. Dependability is considered a set of one or more properties of computing systems, including reliability, availability, safety, confidentiality, integrity, and maintainability. Our use of the term "dependability" includes security as well.


The objectives of this workshop are to:


Topics include, but are not limited to, the following: There has been some work addressing these questions for different dependability properties (e.g., reliability, safety, availability, security) and for different systems (i.e., for specific domains, applications, configurations, or platforms). Our goal is to identify what these efforts and results have in common and what can be transferred, applied, and learned from one to another.


The workshop is open to all researchers, system and software developers and acquirers, and users who are involved with or interested in the empirical evaluation of dependability. All prospective participants are required to submit an extended abstract presenting their current results, a work-in-progress report, or a position paper.

Submitted papers must be original work with no substantial overlap with papers that have been published or that are simultaneously submitted to a journal or a conference with proceedings. Papers should be at most 4 pages in IEEE proceedings style (two-column pages, single-spaced, using 11-point font and 1-inch margins), including all figures and references. Each submission should begin with a title and the names and contact information of the authors. Submissions must be made electronically, in PDF or PostScript format, by email to the organizers.

Submitted contributions will be fully refereed by PC members. Accepted papers will be published in the supplement volume of the DSN 2006 proceedings, provided that an author attends the workshop.

Important Dates

Paper submission due: March 7, 2006 (Tuesday)
Acceptance notification: April 4, 2006 (Tuesday)
Camera-ready version of papers due: April 18, 2006 (Tuesday)


Workshop Organizers

Michel Cukier, University of Maryland
Ioana Rus, Fraunhofer Center Maryland


For more information on the location and on DSN06, please visit the conference website. For workshop information, please contact the organizers, Michel Cukier or Ioana Rus, by email.


Program Committee

Matti Hiltunen, AT&T Labs, USA
Mohamed Kaaniche, LAAS-CNRS, France
Peter Lakey, Cognitive Concepts LLC, USA
Henrique Madeira, University of Coimbra, Portugal
Frank Marotta, U.S. Army Aberdeen Test Center, USA
John Murdoch, University of York, UK
Holger Peine, Fraunhofer IESE, Germany