Artifact Evaluation Track
ICPE 2024 welcomes the submission of artifacts to the ICPE Artifact Track. According to ACM’s “Result and Artifact Review and Badging” policy, an “artifact” is “a digital object that was either created by the authors to be used as part of the study or generated by the experiment itself […] software systems, scripts used to run experiments, input datasets, raw data collected in the experiment, or scripts used to analyze results.” A formal review of such artifacts not only ensures that the “study is repeatable” by the same team; if the artifacts are made available online, other researchers “can replicate the findings”, and the artifacts can be reused by other teams. For more information about artifact reviews and badging, please refer to the ACM policy page: https://www.acm.org/publications/policies/artifact-review-and-badging-current
In this spirit, the ICPE 2024 Artifacts Track exists to review, promote, share and catalog the research artifacts of interest to the Performance Engineering community.
Types of artifact submissions
- Research paper artifacts (accompanying accepted papers in the Research or Industry track)
- Already have a research paper in the proceedings
- The research paper will acquire a badge on artifact acceptance
- Tool artifact
- A paper of up to 4 pages in the ACM Proceedings format, not including references, that describes the artifact, its utility, benefits, and success stories of its utilization. The paper must provide a link from where the artifact can be downloaded. At submission time, a non-permanent link (e.g., to a personal page or GitHub project) is accepted. An extra page is allowed for references.
- This category of artifacts includes tools, libraries, scripts, benchmarks, and frameworks. The authors must provide instructions for installing the artifact (including software and hardware requirements) and using it, either on the website from which the artifact is downloaded or in the same bundle.
- Data artifact
- A paper of up to 4 pages in the ACM Proceedings format, not including references, that clearly describes the dataset and its format, its utility and relevance for reuse by the performance engineering research community, the data acquisition process, and guidelines or examples on how to use it, so as to foster trust. The paper must provide a link from where the dataset can be downloaded, together with success stories of its utilization. At submission time, a non-permanent link (e.g., to a personal page or GitHub project) is accepted. An extra page is allowed for references.
To facilitate the review, the artifacts may include, but are not limited to, source code, executables, datasets, a virtual machine image, and documents (please use open formats for documents). Please make sure that your artifact is self-contained (with the exception of pointers to external tools or libraries, which we will not consider part of the evaluated artifact, but which we will try to use when evaluating the artifact). All submitted artifacts must be open-access (meaning that they are available for download without charge), and open-source licenses are strongly preferred. By the camera-ready deadline, the artifacts must be available from a permanent URL or DOI with an archival plan, such as the Zenodo repository (a personal page is not sufficient). Note that public source repositories such as GitHub are not archival services; while authors may host copies of the artifact in such services for convenience, a true archival service with guarantees regarding long-term storage is required.
If you require an exception from the conditions above, please email the chairs at email@example.com before submitting.
What do you get out of it?
Tool and data artifacts
If your artifact is accepted, your artifact description paper will be published in the ICPE 2024 Companion Proceedings and will receive one of the badges described in Types of badges awarded.
Research paper artifacts
If your artifact is accepted, your research paper will receive one of the following badges, shown both in the text of the paper and in the ACM Digital Library.
Types of badges awarded
The following types of badges can be awarded to papers, as defined by the ACM Artifact Review and Badging policy v1.1:
Artifacts Available: This badge is applied to papers in which associated artifacts have been made permanently available for retrieval.
Artifacts Evaluated - Functional: The artifacts associated with the research are found to be documented, consistent, complete, exercisable, and include appropriate evidence of verification and validation.
Results Reproduced (Paper Artifacts only): The main results of the paper have been obtained in a subsequent study by a person or team other than the authors, using, in part, artifacts provided by the authors.
Artifacts Evaluated - Reusable: The artifacts associated with the paper are of a quality that significantly exceeds minimal functionality. That is, they have all the qualities of the Artifacts Evaluated – Functional level, but, in addition, they are very carefully documented and well-structured to the extent that reuse and repurposing is facilitated. In particular, norms and standards of the research community for artifacts of this type are strictly adhered to.
Regarding archival, all accepted artifacts will be indexed on the conference website.
How to submit?
Submissions are made via https://icpe24-artifacts.hotcrp.com/.
Submission deadlines are listed on the important dates page.
The artifact evaluation committee for ICPE 2024 is participating in a study to better understand how research artifacts and scholarly infrastructure can support reproducibility. The goal of the study is to help make artifact preparation and evaluation as simple as possible. You can support this work without any additional effort on your part by allowing a research team from the University of Utah to observe the evaluation of your artifact submission. More information on the study, its risks and benefits, can be found here. Questions can be directed to Dr. Hannah Cohoon at firstname.lastname@example.org. When submitting your artifact, you will be asked if you consent to participating in this study—you do not need to consent in order to submit an artifact. Your participation will have no impact on how your artifact is evaluated.
Tool and data artifacts: We request the submission of a paper of at most 4 pages in the ACM format describing the artifact. The paper must include the link from where the artifact can be downloaded. Since the paper will be published in the ICPE 2024 Companion Proceedings, it does not need to include the prerequisites or the instructions to install and execute the artifact. These instructions must be provided on the linked website or in the bundle (e.g., in a README.md file).
Research paper artifacts: We request a PDF submission that includes: a) a first page, in any format, with a brief summary of the artifact and the link to the artifact; b) the camera-ready version of the accompanying research paper, so that the reviewers can properly judge the artifact. As for the other artifact types, the hardware and software prerequisites as well as the instructions to install and run the artifact must be provided on the linked website or in the bundle (e.g., in a README.md file).
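As a purely illustrative sketch, a README.md accompanying an artifact might cover the required information along these lines (all names, versions, and commands below are hypothetical examples, not requirements):

```markdown
# ExampleTool: ICPE 2024 Artifact

Brief description of the artifact and the paper claims it supports.

## Prerequisites
- Hardware: x86-64 machine with at least 8 GB RAM
- Software: Linux, Docker 24+ (or Python 3.10+ for a native install)

## Installation
    docker build -t exampletool .

## Running the experiments
    docker run --rm exampletool ./run_all.sh

## Expected output
Describe where results are written and how they map to the tables and
figures in the paper.

## License
State the (preferably open-source) license here.
```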
Video: Optionally, authors of all types of artifacts are encouraged to submit a link to a short video (maximum 5 minutes) demonstrating the artifact. For tool and data artifacts, the link to the video is included in the 4-page paper. For research paper artifacts, the link should appear on the first page of the submitted PDF.
When preparing an artifact, it is important to keep in mind: a) how accessible you are making your artifact to other researchers, and b) that the ICPE artifact evaluators will have very limited time to assess each artifact. Artifacts whose configuration and installation take an undue amount of time may be rejected. If you envision difficulties, please provide your artifact in an easily portable form, such as a virtual machine image (https://www.virtualbox.org) or a container image (https://www.docker.com).
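For instance, a minimal Dockerfile (a hypothetical sketch; the base image, dependencies, and scripts are placeholders) can package an artifact so that evaluators only need to build and run one container:

```dockerfile
# Hypothetical example; base image and dependencies are placeholders
FROM ubuntu:22.04

# Install the software prerequisites documented in the README
RUN apt-get update && apt-get install -y --no-install-recommends \
        python3 python3-pip && \
    rm -rf /var/lib/apt/lists/*

# Copy the artifact into the image and install its dependencies
WORKDIR /artifact
COPY . /artifact
RUN pip3 install -r requirements.txt

# Default command: run the full experiment pipeline
CMD ["./run_all.sh"]
```

Evaluators could then reproduce the environment with `docker build -t artifact .` followed by `docker run --rm artifact`, without installing anything on the host beyond Docker itself.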
Review Process and Selection Criteria
Each artifact will go through an anonymous review. Artifacts associated with a research paper will be evaluated in relation to the expectations set by the paper; standalone tool and data artifacts will be evaluated against their accompanying 4-page paper. Submitted artifacts will go through a two-phase evaluation:
Kicking the tires: reviewers check the artifact’s integrity and look for any setup problems that may prevent it from being properly evaluated (e.g., corrupted or missing files, a VM that does not start, immediate crashes, or platform configuration problems that prevent running the artifact). Authors are informed of the outcome of this phase and can help resolve any issues found during a brief author response period.
Artifact assessment: reviewers evaluate the artifacts, checking if they live up to the expectations created by the paper.
This process will involve more interaction between the evaluators and authors than a typical paper review process: the review committee may issue additional requests to authors during the assessment to fix bugs in the artifact. The resulting version of the artifact should be considered “final” and should allow reviewers to decide about artifact acceptance and badges. For accepted tool and data artifacts, a camera-ready version of the paper that includes the link to a permanent repository will be requested.
Artifacts will be scored using the following criteria.
For full details of the evaluation criteria, please see the ACM Artifact Review and Badging policy, version 1.1.
Artifacts Available
- Author-created artifacts have been placed on a publicly accessible archival repository. A DOI or link to this repository, along with a unique identifier for the object, is provided.
Artifacts Evaluated - Functional
- Documented: Is it accompanied by relevant documentation making it easy to use?
- Consistent: Is the artifact relevant to the associated paper (the accepted paper or the standalone 4-page artifact paper), and does it contribute in some inherent way to the generation of its main research results or motivating success stories?
- Complete: To the extent possible, are all components relevant to the research or artifact paper in question included? (Proprietary artifacts need not be included. If they are required to exercise the package, this should be documented, along with instructions on how to obtain them. Proxies for proprietary data should be included so as to demonstrate the analysis.)
- Exercisable: If the artifact is executable, is it easy to download, install, and execute? Can the included scripts and/or software used to generate the results in the associated paper be successfully executed, and can the included data be accessed and appropriately manipulated?
Results Reproduced
- Applies only to artifacts submitted alongside Research or Industry papers
- The main results of the paper have been obtained in a subsequent study by a person or team other than the authors, using, in part, artifacts provided by the authors.
Artifacts Evaluated - Reusable
- The artifacts are of a quality that significantly exceeds minimal functionality. That is, they have all the qualities of the Artifacts Evaluated – Functional level, but, in addition, they are very carefully documented and well-structured to the extent that reuse and repurposing is facilitated. In particular, norms and standards of the research community for artifacts of this type are strictly adhered to.
Instructions for Authors from ACM
By submitting your article to an ACM Publication, you are hereby acknowledging that you and your co-authors are subject to all ACM Publications Policies, including ACM’s new Publications Policy on Research Involving Human Participants and Subjects. Alleged violations of this policy or any ACM Publications Policy will be investigated by ACM and may result in a full retraction of your paper, in addition to other potential penalties, as per ACM Publications Policy.
Please ensure that you and your co-authors obtain an ORCID ID, so you can complete the publishing process for your accepted paper. ACM has been involved in ORCID from the start and we have recently made a commitment to collect ORCID IDs from all of our published authors.
The collection process has started and will roll out as a requirement throughout 2022. We are committed to improving author discoverability, ensuring proper attribution, and contributing to ongoing community efforts around name normalization; your ORCID ID will help in these efforts.
Artifact Evaluation Track Chairs
- Robert Ricci, University of Utah, USA
- Dmitry Duplyakin, National Renewable Energy Laboratory, USA
Submissions are to be made via https://icpe24-artifacts.hotcrp.com/.
We are accepting self-nominations to serve on the committee: Please nominate yourself at https://forms.gle/oTkM8MgFwSe8uRwm8 and share this information with any others who you think would be interested.
Study of the Artifact Evaluation Process
The ICPE 2024 Artifact Evaluation Track Chairs and a team of researchers from the University of Utah are trying to better understand how research artifacts and scholarly infrastructure can support reproducibility. Artifact evaluators can support this work by allowing the research team to observe their communication with authors and other reviewers over HotCRP; other opportunities for participation are available as well. We expect that, as a result of this research, we will be able to streamline the artifact evaluation process in the future. For more information on the procedures, benefits, and risks of participating in this study, please click here. When applying to serve as an artifact evaluator, you will be asked if you consent to participating. Questions can be directed to Dr. Hannah Cohoon at email@example.com.