First international Competition on Runtime Verification: rules, benchmarks, tools, and final results of CRV 2014

Ezio Bartocci, Yliès Falcone, Borzoo Bonakdarpour, Christian Colombo, Normann Decker, Klaus Havelund, Yogi Joshi, Felix Klaedtke, Reed Milewicz, Giles Reger, Grigore Rosu, Julien Signoles, Daniel Thoma, Eugen Zalinescu, Yi Zhang

Research output: Contribution to journal › Article › peer-review

Abstract

The first international Competition on Runtime Verification (CRV) was held in September 2014 in Toronto, Canada, as a satellite event of the 14th International Conference on Runtime Verification (RV'14). The event was organized in three tracks: (1) offline monitoring, (2) online monitoring of C programs, and (3) online monitoring of Java programs. In this paper, we report on the phases and rules of the competition, describe the participating teams and their submitted benchmarks, and present the full results as well as the lessons learned from the competition.

Original language: English
Pages (from-to): 1-40
Number of pages: 40
Journal: International Journal on Software Tools for Technology Transfer
Early online date: 7 Apr 2017
Publication status: Published - 2017

Keywords

  • Benchmarks
  • Monitoring
  • Runtime Verification
  • Software competition
