Keeping a steady influx of newcomers is critical to the survival of open source software projects, yet newcomers face many barriers when contributing to a project for the first time. To support newcomer onboarding, GitHub encourages projects to apply labels such as good first issue (GFI) to tag issues suitable for newcomers. However, many newcomers still fail to contribute even after multiple attempts, which not only dampens their enthusiasm but also wastes the effort project members invest in labeling. To better support newcomer onboarding, this paper reports a preliminary study of the GFI mechanism, covering its adoption, effectiveness, problems, and best practices. By analyzing 9,368 GFIs from 816 popular GitHub projects and conducting email surveys with newcomers and project members, we obtain the following results. We find that an increasing number of projects, especially popular ones, have adopted this mechanism over the past decade. Compared with ordinary issues, GFIs usually take more days to resolve. While some newcomers do join projects through GFIs, almost half of GFIs are not solved by newcomers. We also identify a series of problems that make this mechanism ineffective, relating to the mechanism itself (e.g., inappropriate GFIs), the project (e.g., insufficient GFIs), and the newcomers (e.g., uneven skills). We further identify practices that may address these problems, such as labeling GFIs that have informative descriptions and available support, and that require limited scope and skills. Newcomer onboarding is an important but challenging problem in open source projects; our work enables a better understanding of the GFI mechanism and its problems, and highlights ways to improve it.
Thu 12 Nov (Displayed time zone: (UTC) Coordinated Universal Time)

01:30 - 02:00
01:30 | 2m Talk | A First Look at Good First Issues on GitHub | Research Papers | Xin Tan (Peking University, China), Minghui Zhou (Peking University, China), Zeyu Sun (Peking University, China) | DOI
01:32 | 2m Talk | A Theory of the Engagement in Open Source Projects via Summer of Code Programs | Research Papers | Jefferson Silva (PUC-SP, Brazil), Igor Wiese (Federal University of Technology Paraná, Brazil), Daniel M. German (University of Victoria, Canada), Christoph Treude (University of Adelaide, Australia), Marco Gerosa (Northern Arizona University, USA), Igor Steinmacher (Northern Arizona University, USA) | DOI
01:35 | 1m Talk | Biases and Differences in Code Review using Medical Imaging and Eye-Tracking: Genders, Humans, and Machines | Research Papers | Yu Huang (University of Michigan), Kevin Leach (University of Michigan), Zohreh Sharafi (University of Michigan), Nicholas McKay (University of Michigan, USA), Tyler Santander (University of California at Santa Barbara), Westley Weimer (University of Michigan, USA) | DOI
01:37 | 1m Talk | Does Stress Impact Technical Interview Performance? | Research Papers | Mahnaz (Mana) Behroozi (North Carolina State University, USA), Shivani Shirolkar (North Carolina State University, USA), Titus Barik (Microsoft, USA), Chris Parnin (North Carolina State University, USA) | DOI
01:39 | 1m Talk | Reducing Implicit Gender Biases in Software Development: Does Intergroup Contact Theory Work? | Research Papers | DOI
01:41 | 19m Talk | Conversations on Community | Paper Presentations | Kelly Blincoe (University of Auckland), Mahnaz (Mana) Behroozi (North Carolina State University, USA), Xin Tan (Peking University, China), Yi Wang (Rochester Institute of Technology), Yu Huang (University of Michigan), Moderator: Peter Rigby (Concordia University, Montreal, Canada)