Publication . Conference object . Preprint . 2020

Code Duplication and Reuse in Jupyter Notebooks

Andreas P. Koenzen, Neil A. Ernst, Margaret-Anne Storey
Open Access . English
  • Published: 29 May 2020
Abstract
Duplicating one's own code makes it faster to write software, an expediency that is particularly valuable for users of computational notebooks: duplication lets notebook users quickly test hypotheses and iterate over data. In this paper, we explore how much, how, and from where code duplication occurs in computational notebooks, and identify potential barriers to code reuse. Previous work on computational notebooks describes developers' motivations for reuse and duplication, but does not show how much reuse actually occurs or which barriers users face when reusing code. To address this gap, we first analyzed GitHub repositories for code duplicates contained in each repository's Jupyter notebooks, and then conducted an observational user study of code reuse in which participants solved specific tasks using notebooks. Our findings reveal that repositories in our sample have a mean self-duplication rate of 7.6%. However, in our user study, few participants duplicated their own code, preferring instead to reuse code from online sources.
Comment: Accepted as a full paper at the IEEE Symposium on Visual Languages and Human-Centric Computing (VL/HCC) 2020
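The "self-duplication rate" reported in the abstract can be illustrated with a small sketch. The record does not specify the paper's actual clone-detection method, so the snippet below is only an illustrative stand-in: it treats two code cells of a notebook as duplicates when their whitespace-normalized text is identical. The function name and the toy notebook are invented for the example.

```python
import json
from collections import Counter

def self_duplication_rate(notebook_json: str) -> float:
    """Fraction of non-empty code cells that repeat an earlier cell
    in the same notebook, after whitespace normalization.

    Illustrative only; real clone detectors also catch near-duplicates
    (renamed variables, edited lines), which this exact-match sketch misses.
    """
    nb = json.loads(notebook_json)
    cells = [
        "".join(c.get("source", [])).strip()
        for c in nb.get("cells", [])
        if c.get("cell_type") == "code"
    ]
    cells = [c for c in cells if c]  # ignore empty cells
    if not cells:
        return 0.0
    counts = Counter(cells)
    duplicates = sum(n - 1 for n in counts.values())
    return duplicates / len(cells)

# Toy notebook: one of three code cells is an exact repeat.
toy_nb = json.dumps({
    "cells": [
        {"cell_type": "code", "source": ["import pandas as pd\n"]},
        {"cell_type": "code", "source": ["df.head()\n"]},
        {"cell_type": "code", "source": ["import pandas as pd\n"]},
    ]
})
print(round(self_duplication_rate(toy_nb), 2))  # prints 0.33
```

A whole-repository rate, like the 7.6% mean in the abstract, would aggregate this measure over every notebook in the repository.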
Subjects
free text keywords: Computer Science - Software Engineering, Computer Science - Human-Computer Interaction, Jupyter, computational notebooks, code duplication, code clones, code reuse, data analysis, data exploration, exploratory programming
Funded by
  • Funder: Natural Sciences and Engineering Research Council of Canada (NSERC)
Download from (4 versions): Open Access
https://doi.org/10.5281/zenodo...
Conference object . 2020 . Providers: Datacite, ZENODO