In our continuing #software-outreach efforts, and after this past summer's great Summer of Code programs -- #soc-2017 -- I want to do a small survey to see if we can measure the effectiveness of some of our outreach and onboarding efforts.
(note that I'm also starting to collect info about these strategies on this page)
Roughly, we've gone from ~16 contributors in May 2016 to over a hundred today -- a dramatic change over about 18 months, especially considering that our codebases had attracted only those 16 contributors since their start in 2010. We have also been seeking to invite and support a more diverse coding community, to develop a better site (this site -- PublicLab.org) with input from more perspectives.
But I know only enough about evaluation as a science or skillset to know that I don't know enough about it. I'm trying to:
- write questions that are not leading
- write check-all-that-apply and scale from 1-5 questions so we can quantify the results more easily
- allow space for open response answers
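As a rough sketch of how the two quantifiable question types above could be tallied once responses come in -- check-all-that-apply questions as option counts, and 1-5 scale questions as an average -- here is a minimal Python example. The response data and field names (`heard_via`, `welcoming_rating`) are hypothetical, just for illustration:

```python
from collections import Counter
from statistics import mean

# Hypothetical survey submissions, one dict per respondent.
responses = [
    {"heard_via": ["GitHub", "a friend"], "welcoming_rating": 4},
    {"heard_via": ["Twitter"], "welcoming_rating": 5},
    {"heard_via": ["GitHub"], "welcoming_rating": 3},
]

# Check-all-that-apply: count how often each option was selected.
heard_counts = Counter(
    option for r in responses for option in r["heard_via"]
)

# 1-5 scale: average the ratings across all respondents.
avg_rating = mean(r["welcoming_rating"] for r in responses)

print(heard_counts.most_common())
print(round(avg_rating, 2))
```

Something in this shape would make it easy to compare results across survey runs, or across projects if other communities adopt the same questions.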
I need help getting a small survey together that we can send our contributors, and I'd especially like it if other free and open source projects used it so we could have some standardization of evaluation across projects, and better understand how we relate to efforts in other communities.
Questions
In particular, I'm interested in:
- effectiveness of our efforts to support and encourage new contributors
- satisfaction and feedback on these efforts from newcomers
- better understanding of the diversity of our new contributors, and in change over time
Sharing results
Also, should this be anonymous and should we publish the results in full or in part so we can compare with other orgs and learn as a community about our efforts?
Finally, I'd love it if other projects seeking to do their own mini-eval projects could adopt a standard, ready-made survey, maybe add a question or two to the end, and not be stuck as I am on developing an eval from scratch.
That said, I'm VERY happy to simply adopt a survey that's out there, and add on some of our more specific questions.
Thanks for any help!
Survey draft
I've posted the questions here in a Google Doc as well -- please leave comments below here and offer edits on the doc!
## Introduction
We’re working hard to make our project more welcoming to newcomers, and to invite contributions from a more diverse community -- we see this as essential to improving the project and building on a wider range of perspectives and experiences.
This survey is anonymous, but if you'd like to leave your contact information for any follow-up, you're welcome to do so.
## What we’ll do with the results
...
## Core questions
Was this the first open source project you contributed to?
How did you hear about this project? (check all that apply)
1. from a friend or acquaintance
2. on GitHub, GitLab, or another code sharing site
3. on Twitter, Facebook, or other social media
4. through an event
5. through a bootcamp, training, or class
6. on the project website
Why did you choose this project? (check all that apply)
1. friendly
2. easy
3. interested in the project goals
4. looking to build my own skills
5. ...
Do you plan to continue contributing to this project?
What were your motivations for contributing to this project? (multiple choice)
How would you rate the effectiveness of this project's efforts at welcoming new contributors?
## First-timers-only issues
Did you do a first-timers-only (FTO) issue?
Did you create any FTO issues yourself? If so, how many?
Did you feel the FTO issues you did better prepared you to contribute to Public Lab code? (1-5)
## Creating your own first-timers-only issues
Did you feel that creating your own FTO issues for others supported your ability to contribute?
Did you feel creating or completing FTO issues made you a better software contributor in general?
Thanks Jeff! There is also a need to coordinate this survey with the other surveys that are sent around Public Lab, such as the post-experience surveys we distribute after events, and the annual community surveys that are distributed to the entire community and to the organizers group. These surveys have all been designed to track progress against Public Lab's logic model, which is just about finalized with support from the University of California, Davis Center for Community and Citizen Science. A software contributor survey will also need to line up with our logic model so that we can track whether we are making progress toward our short, medium, and long term outcomes.
Great, thanks Liz - really helpful. The diversity section and the participation section (as you mentioned in chat a moment ago) would be super to include.
A copy of the Public Lab 2017 Community Survey is here: https://docs.google.com/forms/d/1zkKp13BouJ9H8M1hzN9M81jcHi3i8KNklFwvuoqsxxw/prefill
PDF version here: Public-Lab-Community-Survey-2017.pdf
OK, so I was able to work almost verbatim from the diversity and participation sections of the 2017 Community Survey -- I left a comment about adding 2-3 extra options on just one question in Participation. Awesome.
For the "extra" questions specific to our first-timer outreach, I've narrowed it down to:
This may be a lot, and I think we could skip 8 & 9 to make the survey shorter and easier to complete.
Quick additional questions -- for ethnicity, was there an important reason the Community Survey doesn't allow "check all"? (I usually check more than one)
And does the Community Survey say anything about how data will be used, whether it's private or anonymous or where the results could be shared? I'm trying to craft a sentence about that for this.
Thanks!!!
Ah! That was an oversight, that there was no "check all that apply" -- do you think we should add to what's there or instead remove "multi-ethnic" in favor of this improvement?
The Community Survey did not say how the data would be used -- I believe that was an oversight. It was anonymous unless someone offered their contact information. Glad you'll be including language around this; the All Community Survey will do so in the future as well.
OK -- we just sent it out to a group of about 8 people! @mollydb - did you happen to see this call?
We sent this version: https://docs.google.com/forms/d/1zkKp13BouJ9H8M1hzN9M81jcHi3i8KNklFwvuoqsxxw/prefill
We ultimately posted this survey -- thanks, everyone!
https://github.com/publiclab/plots2/issues/1890
Here is a really great effort to come up with refined demographic survey language -- OpenDemographics: https://drnikki.github.io/sphinx-ghpages/
It's not complete -- there's a lot we can do to improve it and collaborate -- but I really like their open model!
Could be of interest @liz!
Just a note that Open Demographics, Mozilla/diversity, and others are doing great work on standardized questions...
Including this post on age/gender identity surveying:
https://channelcs.github.io/gender-identity.html#designing-forms-for-gender-identity
All this is incredible and should help enormously!
And linking this forward to the 2019 Public Lab Software Community Survey, which we ran for some months; here is the report:
https://publiclab.org/notes/liz/04-15-2019/report-2019-software-contributors-survey
And here are the questions we used, which came out of this discussion, in a non-functional template form (we can make you a copy of this form if you like):
https://docs.google.com/forms/d/105qPG_ojXEga85lLTiNEXcH0vtqFqJt10lJz0SyGOIo/edit