Thu 12 Nov 2020 01:30 - 01:32 at Virtual room 1 - Community

Keeping a good influx of newcomers is critical for the survival of open source software projects, yet newcomers face many barriers when contributing to a project for the first time. To support newcomer onboarding, GitHub encourages projects to apply labels such as good first issue (GFI) to tag issues suitable for newcomers. However, many newcomers still fail to contribute even after many attempts, which not only dampens their enthusiasm for contributing but also wastes the efforts of project members. To better support newcomer onboarding, this paper reports a preliminary study of this mechanism, covering its adoption status, effects, problems, and best practices. By analyzing 9,368 GFIs from 816 popular GitHub projects and conducting email surveys with newcomers and project members, we obtain the following results. We find that an increasing number of projects, especially popular ones, have applied this mechanism over the past decade. Compared to common issues, GFIs usually take more days to be solved. While some newcomers do join projects through GFIs, almost half of GFIs are not solved by newcomers. We also discover a series of problems covering the mechanism (e.g., inappropriate GFIs), the project (e.g., insufficient GFIs), and the newcomer (e.g., uneven skills) that make this mechanism ineffective. We identify practices that may address these problems, including identifying GFIs that have informative descriptions and available support, and that require limited scope and skills. Newcomer onboarding is an important but challenging problem in open source projects, and our work enables a better understanding of the GFI mechanism and its problems, as well as highlighting ways to improve them.

Thu 12 Nov
Times are displayed in time zone: (UTC) Coordinated Universal Time

01:30 - 01:32
Talk
A First Look at Good First Issues on GitHub
Research Papers
Xin Tan (Peking University, China), Minghui Zhou (Peking University, China), Zeyu Sun (Peking University, China)
01:32 - 01:34
Talk
A Theory of the Engagement in Open Source Projects via Summer of Code Programs
Research Papers
Jefferson Silva (PUC-SP, Brazil), Igor Wiese (Federal University of Technology Paraná, Brazil), Daniel M. German (University of Victoria, Canada), Christoph Treude (University of Adelaide, Australia), Marco Gerosa (Northern Arizona University, USA), Igor Steinmacher (Northern Arizona University, USA)
01:35 - 01:36
Talk
Biases and Differences in Code Review using Medical Imaging and Eye-Tracking: Genders, Humans, and Machines
Research Papers
Yu Huang (University of Michigan), Kevin Leach (University of Michigan), Zohreh Sharafi (University of Michigan), Nicholas McKay (University of Michigan, USA), Tyler Santander (University of California at Santa Barbara), Westley Weimer (University of Michigan, USA)
01:37 - 01:38
Talk
Does Stress Impact Technical Interview Performance?
Research Papers
Mahnaz Behroozi (North Carolina State University, USA), Shivani Shirolkar (North Carolina State University, USA), Titus Barik (Microsoft, USA), Chris Parnin (North Carolina State University, USA)
01:39 - 01:40
Talk
Reducing Implicit Gender Biases in Software Development: Does Intergroup Contact Theory Work?
Research Papers
Yi Wang (CoCo Labs, USA), Min Zhang (East China Normal University, China)
01:41 - 02:00
Talk
Conversations on Community
Paper Presentations
Kelly Blincoe (University of Auckland), Mahnaz Behroozi (North Carolina State University, USA), Xin Tan (Peking University, China), Yi Wang (Rochester Institute of Technology), Yu Huang (University of Michigan); Moderator: Peter Rigby (Concordia University, Montreal, Canada)