Hiya, I’ve recently been working with CPS volunteering data for Massachusetts, including analyzing the harmonized variables that carry across the transition from the Volunteer Supplement to the Civic Engagement and Volunteer Supplement.
One thing I’ve noticed is that statewide, the number of responses collapses during that transition, leading to some odd jumps in the share and number of people reporting that they did not volunteer between 2015 and 2017. For instance, in 2015, 1,212 respondents said they did not volunteer at any time during the previous year, for a weighted total of 4.2 million. But in 2017, only about 729 respondents said they didn’t volunteer, for a weighted total of 3.2 million.
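For reference, here is roughly how I tallied those figures from my extract. This is just a sketch: the variable names (YEAR, STATEFIP, VLSTATUS, VLSUPPWT), the Massachusetts FIPS code (25), and the code for "did not volunteer" are assumptions about the extract layout and should be checked against the codebook.

```python
import pandas as pd

def nonvolunteer_totals(df):
    """Unweighted counts and supplement-weighted totals of non-volunteers
    in Massachusetts, by year. Assumes VLSTATUS == 1 marks respondents
    who reported not volunteering (verify against your codebook)."""
    ma = df[(df["STATEFIP"] == 25) & (df["VLSTATUS"] == 1)]
    return ma.groupby("YEAR").agg(
        respondents=("VLSUPPWT", "size"),      # unweighted respondent count
        weighted_total=("VLSUPPWT", "sum"),    # population estimate
    )

# Toy stand-in for a real IPUMS CPS extract (invented rows and weights).
df = pd.DataFrame({
    "YEAR":     [2015, 2015, 2017],
    "STATEFIP": [25, 25, 25],
    "VLSTATUS": [1, 1, 1],
    "VLSUPPWT": [3000.0, 2500.0, 4400.0],
})
print(nonvolunteer_totals(df))
```

With a real extract, the `respondents` column gives the raw counts (1,212 and 729 in my data) and `weighted_total` the population estimates.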
Knowing that the survey changed modes between 2015 and 2017, I’m concerned these jumps and dips are an artifact of survey design rather than any real change, especially since the question text changed to clarify that infrequent volunteering and volunteering for children’s schools or youth organizations count as volunteering. To that end, has anyone found a way to adjust for the change in question text to get a consistent universe?
Thanks,
Peter
As you note, there were a number of changes to the CPS Volunteer Supplement in 2017. These included new references to volunteering through an association, a change in universe (from those aged 15+ to those aged 16+), and the replacement of some questions about volunteering with new, broader questions about engagement with one’s community. A major change that is likely driving the decrease in the number of non-volunteers is the large increase in supplement non-interviews from 2015 to 2017:
- In 2015, a non-interview occurred if an eligible respondent gave a “refused” response to either of two questions: “Have you done any volunteer activities through or for an organization in the past year?” or “Sometimes people don’t think of activities they do infrequently or activities they do for children’s schools or youth organizations as volunteer activities. Have you done any of these types of volunteering activities in the past year?” 20,777 of the 104,534 eligible supplement respondents (~20%) refused one of these questions and were classified as non-interviews on this criterion.
- In 2017, a non-interview was redefined as a case in which an eligible respondent “refused” to answer the question, “How often did you talk to or spend time with friends and family?” 37,888 of the 102,123 eligible supplement respondents (~37%) refused this question and were classified as non-interviews on this criterion. The refusal rate has continued to hover around 40% in subsequent years.
While the volunteer supplement weight (VLSUPPWT) partially corrects for bias from undercoverage, biases may still be present when people who are missed by the survey differ from those interviewed in ways other than age, race, sex, Hispanic origin, and state of residence. It seems reasonable to me that willingness to respond to the interview may be correlated with having volunteered, beyond these demographic factors. Abraham, Helms, and Presser (2009) argue that non-response was significantly biasing estimates from surveys even in the period before this doubling of the refusal rate:
The authors argue that both the large variability in survey estimates of volunteering and the fact that survey estimates do not show the secular decline common to other social capital measures are caused by the greater propensity of those who do volunteer work to respond to surveys. Analyses of the American Time Use Survey (ATUS)— the sample for which is drawn from the Current Population Survey (CPS)—together with the CPS volunteering supplement show that CPS respondents who become ATUS respondents report much more volunteering in the CPS than those who become ATUS nonrespondents. This difference is replicated within subgroups. Consequently, conventional adjustments for nonresponse cannot correct the bias. Although nonresponse leads to estimates of volunteer activity that are too high, it generally does not affect inferences about the characteristics of volunteers.
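The mechanism the authors describe can be seen in a toy illustration (all numbers invented): if volunteers are more likely to respond than non-volunteers *within* a demographic cell, then reweighting that cell back to its population total cannot remove the bias, because every respondent in the cell gets the same weight.

```python
# Toy illustration (invented numbers) of non-response bias that
# post-stratification on demographics cannot fix.
pop = 1_000_000          # people in one demographic cell
true_vol_rate = 0.25     # 25% truly volunteer
p_respond_vol = 0.60     # response propensity of volunteers
p_respond_non = 0.40     # response propensity of non-volunteers

vol_resp = pop * true_vol_rate * p_respond_vol          # responding volunteers
non_resp = pop * (1 - true_vol_rate) * p_respond_non    # responding non-volunteers
respondents = vol_resp + non_resp

# Post-stratification assigns every respondent in the cell the same
# weight (pop / respondents), so the weighted volunteer share equals
# the unweighted respondent share -- still biased upward.
est_vol_rate = vol_resp / respondents
print(f"true: {true_vol_rate:.1%}, estimated: {est_vol_rate:.1%}")
# prints: true: 25.0%, estimated: 33.3%
```

The estimate overstates volunteering whenever the two response propensities differ, no matter how the cell is reweighted, which is the authors’ point about conventional non-response adjustments.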
I recommend consulting the current literature for methods that may help improve comparability across this break in the survey methodology.
Thank you Ivan, this overview is much appreciated.