Open Source Survey §
Pattern Category §
- Community building
- OSS Discovery
OSPO Problems / Challenges §
- A lack of up-to-date information on how much open source software is being produced in a university.
- A lack of a coherent campus open source community.
- Uncertainty about what types of support might be most desired by the campus community.
Context §
A research university creating large volumes of research outputs across every discipline.
University policy allows research teams to release their software as open source without requiring institutional permission. This approach fosters innovation but results in limited visibility into the scope and impact of open source contributions across the university.
The lack of visibility makes it difficult for the OSPO to build community, foster collaboration, and contribute to the overall sustainability of open source efforts on campus.
Forces §
Many labs and projects are expert practitioners in open source and open science. However, they may not be known, or may not receive due recognition, outside of siloed networks within their university.
Researchers and development teams on open science/open source projects are overburdened and their needs have not been explored.
Solution §
Develop and disseminate a campus-wide ‘Open Source Survey’ with the objectives of:
- Gauging usage of open source tools amongst members of the university community.
- Identifying open source projects under development.
- Connecting the university/OSPO to more relevant contacts.
- Collecting feedback on improving the open source environment at the institution.
- Understanding perceptions of open source on campus.
The survey also captures respondents' GitHub profiles, links to their projects, and other relevant OSS identifiers.
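Free-text answers to "share your GitHub profile" questions tend to arrive in mixed forms (full URLs, bare handles, declines), so some light normalization helps before collating respondents into a project inventory. The sketch below is illustrative only, assuming a simple answer format; the `extract_github_username` helper is hypothetical and not part of any published survey instrument.

```python
import re
from typing import Optional

# Hypothetical helper (illustrative, not from any real survey pipeline):
# normalize a free-text survey answer into a GitHub username so respondents'
# profiles can be collated into a project inventory.
GITHUB_URL = re.compile(r"(?:https?://)?(?:www\.)?github\.com/([A-Za-z0-9-]+)")

def extract_github_username(answer: str) -> Optional[str]:
    """Return the GitHub username from a profile URL or bare handle, else None."""
    answer = answer.strip()
    match = GITHUB_URL.search(answer)
    if match:
        return match.group(1)
    # Accept bare handles like "@octocat" or "octocat".
    handle = answer.lstrip("@")
    if re.fullmatch(r"[A-Za-z0-9-]+", handle):
        return handle
    return None

responses = [
    "https://github.com/octocat",
    "@octocat",
    "I'd rather not say",
]
print([extract_github_username(r) for r in responses])
# → ['octocat', 'octocat', None]
```

Answers that are neither a URL nor a plausible handle come back as `None`, which keeps explicit declines out of the inventory rather than mangling them into fake usernames.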
Resulting Context §
The survey data enables the university to track OSS projects that were previously unknown outside their own silos and networks.
More open source projects are revealed by respondents and potentially recognized by the university/OSPO.
Survey findings are used to inform and design services and resources based on users' needs.
The university is able to incorporate the results into messaging about the importance of supporting open source activity in a research environment.
Surveys are not perfect; many open source contributors will decline to participate and thus remain "invisible" to the OSPO. Surveys should be combined with additional discovery approaches.
Additional Learning from University of Wisconsin-Madison OSPO §
Our OSPO is hosted by the Data Science Institute. We capitalized on the priorities and skill sets of our colleagues there to assist in the discovery process. This led to the development of the campus-wide survey.
There were 300 responses to our survey. The results provided clear feedback on needs across campus. We shared the results with user groups and stakeholders and used the process as an opportunity to build relationships and promote the OSPO’s services.
We also recruited an outreach specialist to contact people identified through initial introductory calls and through the campus-wide OS survey.
Additional Learning from University of California Santa Barbara §
We conducted a survey across the UC system and ultimately received about 300 responses. Since we wanted to present the results in a scholarly format, we began by working with the Office of Research and the Institutional Review Board. Once we had IRB approval, we left the survey open for one month and advertised it as widely as possible. We maintain a "lessons learned" document with learnings particular to our survey instrument (see the References section below). Virginia gave a lightning talk at the February 2026 CURIOSS Gathering in Dublin describing some of our lessons learned (you can find it on the CURIOSS YouTube channel), but here are the highlights, plus some additional thoughts:
- Have a distribution plan. We received some (justified) criticism from reviewers that our sample size was too small. The truth is that we fought hard to get those 300 responses. On many campuses, our requests to forward the survey were routinely denied, and our paper flyers were removed. (If you use paper flyers, check campus regulations on allowed posting locations.) Try to get to know the listserv administrators, or anyone who can post to a listserv, before the survey period starts. Go to student group fairs and introduce yourself to the students at the booths. Go to events hosted by student groups and introduce yourself to the hosts. Go to research seminars and introduce yourself to the researchers. These relationships can be your way into their listservs. We also had better success reaching listservs via undergraduate advisers than via front-office administrative staff or subject librarians. Your department's communications staff may have other ideas. Individualized communications have better success. Budget a LOT of time for this.
- Use both qualitative and quantitative questions. Each has its strengths and drawbacks.
- We found three overarching themes in our data: culture, resources, and infrastructure. We believe these three themes could serve as a blueprint for creating a comprehensive survey. Make sure you ask questions about all three.
- It is absolutely crucial to do a pre-test. We asked friends and colleagues from other universities to take the survey and give feedback, so we wouldn't have to dip into our own survey population.
- If you're focusing on experienced open source contributors, you can perhaps afford to make the survey a bit long (use the pre-test feedback as a guide!). Many of these people donate hours of their time to projects because they care, and they'll be stoked that the university wants to support them. If you're worried about survey fatigue, you can include periodic optional "stopping points," as in this study: https://peerj.com/articles/cs-963/, although that might make the data trickier to analyze.
- When collecting personal information, follow proper IRB procedures, and have a data management plan even if you aren't seeking IRB approval. We asked people for their GitHub/GitLab usernames and their emails, and they were free to decline to give them. Most (~60-70%) did indeed decline. We also let them choose whether they would like to be added to our Slack or mailing lists. Give them a choice and then respect their choice.
- Reuse our survey instrument! Tweak it, gut it, let us know what you think! https://ucospo.net/oss-resources/survey/
Known Instances §
- UW-Madison Open Source Program Office, Data Science Institute, University of Wisconsin-Madison
- The GW Open Source Program Office, The George Washington University
- University of California OSPO Network
References §
- Community Survey Page - Information about the Open Source Survey at UW-Madison.
- Open Source Survey Results - Anonymised results of the UW-Madison Open Source Survey.
- Reproducing the Survey - The UW-Madison OSPO has open-sourced the survey and provides information on how to reproduce it.
- GW Open Source Survey - Information about the George Washington University’s Open Source Survey.
- UC 2025 Survey Website - All research artifacts from the University of California 2025 survey.
Related Patterns §
Contributor(s) & Acknowledgment §
In alphabetical order:
- Amber Budden https://orcid.org/0000-0003-2885-3980
- Clare Dillon https://orcid.org/0009-0008-6205-0296
- Ciara Flanagan https://orcid.org/0009-0005-3153-7673
- Allison Kittinger https://orcid.org/0000-0002-3104-5995
- Virginia Scarlett https://orcid.org/0000-0002-4156-2849
The UW-Madison Open Source Survey was inspired by a needs assessment survey conducted by NYU Data Science and Software Services (DS3) and by an open source survey, originally produced at the University of Washington, conducted by UW-Madison DSI Director Kyle Cranmer.