ZENODO · Software · 2025 · Data sources: ZENODO, Datacite

Replication Package for "Empirically Evaluating the Impact of Object-Centric Breakpoints on the Debugging of Object-Oriented Programs"

Authors: Bourcier, Valentin; Rani, Pooja; Willembrinck Santander, Maximilian Ignacio; Bacchelli, Alberto; Costiou, Steven

Abstract

Design

Within the "design" directory, you will find the components of the experiment definition and the procedures used to transform and interpret the collected data. The directory is structured as follows:

- Design
  - Procedures - Directory containing the description of the procedures used to filter out invalid participations and transform the data
    - Correctness_analysis.md - Explains the protocol used to decide whether a participant successfully completed each task.
    - Time_computation_analysis.md - Explains the protocol used to interpret and adjust the interruption time on each task.
  - Questionnaires - Directory containing all the questionnaires as they were presented to the participants
    - Demographic_questions.md
    - Experiment_feedback.md
    - Post-task_questions.md
  - Tasks - Directory containing the description of each task composing the experiment; it also presents one solution for debugging Ammolite and Lights Out.
    - Description
      - Ammolite.md
      - LightsOut.md
      - Tutorial.md
      - Warmup.md
    - Solution
      - Ammolite.md
      - LightsOut.md

R-scripts

The "R-scripts" directory contains the scripts to reproduce the results of our experiment. To run the scripts, you must have R installed on your system. The scripts require a few libraries to be installed: "effectsize", "ggplot2", "Hmisc", and "lobstr". If you do not have them installed, you can uncomment the following lines at the top of the file ./R-scripts/reproduction.r:

```
#install.packages("effectsize")
#install.packages("ggplot2")
#install.packages("Hmisc")
#install.packages("lobstr")
```

To launch the script, execute the following command at the root directory of this replication package:

```
> Rscript R-scripts/reproduction.r
```

Data

The "data" directory contains the extracted data generated by the R-scripts. The directory is structured as follows:

```
.
├── Feedback - Additional feedback on the perception of the tasks
│   ├── about-treatment-difficulty.md
│   ├── ammolite-control-then-lightsout-treatment.md
│   ├── grey-feedback.md
│   └── lightsout-control-then-ammolite-treatment.md
├── data.csv - Contains the extraction of all questions and answers
└── extracted-data
    ├── controls - Statistical test results for the comparison of both tasks under the control condition
    │   ├── Controls-full-statistics.txt
    │   ├── control-actions-distribution.pdf
    │   └── control-times-distribution.pdf
    ├── demographics - Distribution of the participants per task for the different demographic factors. Please refer to questions-data-mapping.csv for details on which demographic question each file refers to.
    │   ├── Ammolite-education.csv
    │   ├── Ammolite-frequency.csv
    │   ├── Ammolite-jobs.csv
    │   ├── Ammolite-pharo-frequency.csv
    │   ├── Ammolite-pharo.csv
    │   ├── Ammolite-prog.csv
    │   ├── LightsOut-education.csv
    │   ├── LightsOut-frequency.csv
    │   ├── LightsOut-jobs.csv
    │   ├── LightsOut-pharo-frequency.csv
    │   ├── LightsOut-pharo.csv
    │   ├── LightsOut-prog.csv
    │   ├── code.frequency.csv
    │   ├── education.csv
    │   ├── familiarity.csv
    │   ├── job.position.csv
    │   ├── pharo.exp.csv
    │   ├── pharo.frequency.csv
    │   ├── program.exp.csv
    │   ├── total-jobs.csv
    │   └── total-xp.csv
    ├── experiment-feedback
    │   └── experiment-feedback.csv - Answers given by participants to the post-experiment questions, reported as counts/frequencies of the possible answers.
    ├── post-task-feedback - Answers given by participants to the post-task questionnaires, reported as counts/frequencies of the possible answers after the control task and the treatment task.
    │   ├── ammolite-agree-disagree.csv
    │   ├── ammolite-help.csv
    │   ├── ammolite-yes-no.csv
    │   ├── lightsout-agree-disagree.csv
    │   ├── lightsout-help.csv
    │   └── lightsout-yes-no.csv
    ├── questions-data-mapping.csv - Maps the name of each CSV file to the questions asked after each task and at the end of the experiment
    ├── statistics - Results of the statistical tests (normality, significance) performed on Ammolite and Lights Out
    │   ├── Ammolite-full-statistics.txt - Control-treatment comparison for the Ammolite task
    │   ├── LightsOut-full-statistics.txt - Control-treatment comparison for the Lights Out task
    │   ├── Ammolite-demographics-statistics.txt - Control-treatment demographics for the Ammolite task (chi-squared p-values only)
    │   └── LightsOut-demographics-statistics.txt - Control-treatment demographics for the Lights Out task (chi-squared p-values only)
    └── tools-usage - Tool usage frequency per task and the statistical tests checking for a difference in tool usage between control and treatment (with the Benjamini-Hochberg procedure to limit the false discovery rate)
        ├── BenjaminiH-procedure-for-tools-usage-Ammolite.pdf
        ├── BenjaminiH-procedure-for-tools-usage-LightsOut.pdf
        ├── BenjaminiH-procedure-for-tools-usage.xlsx
        └── detailed-tools-usage.csv
```
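As a convenience, the setup and launch steps above can be sketched as a small POSIX-shell helper. The function name and the CRAN mirror URL are our own additions; the package names and the Rscript invocation come from the replication package instructions.

```shell
# Hypothetical helper: print the commands needed to install the required R
# libraries and run the reproduction script, so they can be reviewed before
# execution. Run it from the root of the replication package.
emit_reproduction_cmds() {
  # Install the four libraries listed in the README (CRAN mirror is an assumption).
  echo "Rscript -e 'install.packages(c(\"effectsize\",\"ggplot2\",\"Hmisc\",\"lobstr\"), repos=\"https://cloud.r-project.org\")'"
  # Run the analysis script from the package root.
  echo "Rscript R-scripts/reproduction.r"
}

emit_reproduction_cmds
```

Piping the output to `sh` (i.e. `emit_reproduction_cmds | sh`) would execute the steps for real; printing them first makes it easy to inspect what will run.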

Debugging consists of understanding the behavior of a program to identify and correct its defects. Breakpoints are the most commonly used debugging tool: they allow developers to interrupt a program's execution at a source code location of their choice and inspect the program's state. Researchers suggest that, in systems developed with object-oriented programming (OOP), traditional breakpoints may not be an effective debugging method. In OOP, developers write code in classes, which at runtime are instantiated as objects: entities with their own state and behavior that can interact with one another. Traditional breakpoints are set in the class code, halting execution for every object that shares that class's code. This leads to unnecessary interruptions for developers who are focused on monitoring the behavior of a specific object. To address this challenge, researchers proposed object-centric debugging, an approach based on debugging tools that focus on objects rather than classes. In particular, using object-centric breakpoints, developers can select specific objects (rather than classes) for which execution must be interrupted. Even though it seems reasonable that this approach may ease the debugging process by reducing the time and number of actions needed to debug objects, no research has yet verified its actual impact. To investigate the impact of object-centric breakpoints on the debugging process, we devised and conducted a controlled experiment with 81 developers, who spent an average of 1 hour and 30 minutes each on the study. The experiment required participants to complete two debugging tasks using debugging tools with vs. without object-centric breakpoints. We found no significant effect of object-centric breakpoints on the number of actions required to debug, or on the effectiveness in understanding or fixing the bug.

However, for one of the two tasks we measured a statistically significant reduction in debugging time for participants who used object-centric breakpoints, while for the other task we measured a statistically significant increase. Our analysis suggests that the impact of object-centric breakpoints varies with the context and the specific nature of the bug being addressed. In particular, our analysis indicates that object-centric breakpoints can speed up locating the root cause of a bug when the bug can be reproduced without restarting the program. We discuss the implications of these findings for debugging practices and future research.

This repository provides the necessary materials to replicate the empirical experiment analysis presented in the paper "Empirically Evaluating the Impact of Object-Centric Breakpoints on the Debugging of Object-Oriented Programs". It includes instructions, data, and artifacts to enable reproduction of the analysis. Additionally, the repository contains PDF files with diagrams of the extracted data whose numbers were reported in the paper. The files are primarily in Markdown format; we recommend reading them with a Markdown renderer. For those who prefer a more traditional format, PDF versions are also available.

Keywords

Breakpoints, Controlled experiment, Debugging, Debugging tools, Empirical evaluation, Object-centric debugging, Object-oriented programming, Object-centric breakpoints
