Over the last decade, research on Cyber-Physical Systems (CPS) and the Internet of Things (IoT) has led to smart systems at different scales and in different environments, from smart homes to smart cities and smart factories. Despite many successes, it is difficult to measure and compare the utility of these results due to a lack of standard evaluation criteria and methodologies. This problem inhibits evaluation against the state of the art, the comparability of different integrated designs (e.g., control and networking), and the applicability of tested scenarios to present and future real-world cyber-physical applications and deployments. This state of affairs is alarming, as it significantly hinders further progress in CPS and IoT systems research.
The Workshop on Benchmarking Cyber-Physical Systems and Internet of Things (CPS-IoTBench) brings together researchers from the different sub-communities of CPS-IoT Week to engage in a lively debate on all facets of rigorously evaluating and comparing experimental results on cyber-physical networks and systems.
Following the success of last year's edition, we will once again hold a special session on the benchmarking and reproducibility of AI and machine learning techniques in CPS and IoT systems.
Call for Papers and Reproducibility Studies
For the workshop's 6th edition, we invite researchers and practitioners from academia and industry to submit papers (up to 6 pages, double-column) focusing on one of the following topics:
- Identify fundamental challenges and open questions in CPS and IoT systems through rigorous benchmarking and evaluation.
- Expose key issues with (or support current practice in) the application and reproducibility of AI and machine learning techniques in CPS and IoT systems.
- Report on success stories or failures with using standard evaluation criteria.
- Present example benchmark systems and approaches from any of the relevant communities (embedded systems, networking, low-power wireless, control, robotics, machine learning, etc.).
- Propose new research directions, methodologies, or tools to increase the level of reproducibility and comparability of evaluation results.
- Report on examples of best practices in different CPS-IoT sub-communities towards achieving the repeatability of results.
- Present models to capture and compare the properties of algorithms and systems.
Well-reasoned arguments or preliminary evaluations are sufficient to support a paper’s claims.
Unique to this year's edition, the workshop also seeks contributions describing the reproduction of experimental results from published work within any of the relevant communities (embedded systems, networking, AI and machine learning, low-power wireless, control, robotics, etc.). Examples include:
- Reproducibility studies of a given scientific paper or article.
- Evaluations carried out by researchers or industry practitioners looking to validate another piece of work before comparing their own to it.
- Reproducibility studies of a given scientific work for which some key aspect(s) of the experimental setup was left unspecified, showing the spectrum of outcomes as a function of such aspect(s).
Such reports can be submitted in the same format (up to 6 pages, double-column) and should carefully describe the experimental setup as well as how the authors ensured that the conditions of the original study were reproduced. Regardless of the outcome of the study, the tone of the report must be constructive and respectful toward the original authors. Prospective authors are invited to contact the authors of the examined study and offer them the opportunity to provide a short rebuttal paragraph to be included in the final section of the report.
Submission Instructions
Submitted papers must not exceed 6 pages (US letter, 9 pt font size, double-column format, following the ACM master article template), including all figures, tables, and references. All submissions must be written in English and should contain the authors' names, affiliations, and contact information. Accepted papers will be published in the ACM Digital Library as part of the CPS-IoT Week 2023 proceedings.
Authors of accepted papers are expected to present their work in a plenary session as part of the main workshop program.
The submission website is available at cps-iotbench2023.hotcrp.com.
1. By submitting your article to an ACM Publication, you are hereby acknowledging that you and your co-authors are subject to all ACM Publications Policies, including ACM's new Publications Policy on Research Involving Human Participants and Subjects. Alleged violations of this policy or any ACM Publications Policy will be investigated by ACM and may result in a full retraction of your paper, in addition to other potential penalties, as per ACM Publications Policy.
https://www.acm.org/publications/policies/research-involving-human-participants-and-subjects
2. Please ensure that you and your co-authors obtain an ORCID ID so that you can complete the publishing process for your accepted paper. ACM has been involved in ORCID from the start and we have recently made a commitment to collect ORCID IDs from all of our published authors. The collection process has started and will roll out as a requirement throughout 2022. We are committed to improving author discoverability, ensuring proper attribution, and contributing to ongoing community efforts around name normalization; your ORCID ID will help in these efforts.
https://orcid.org/register