In mid-July, Â鶹´«Ã½AV launched the Â鶹´«Ã½AV Center on Black Voices, which embodies our commitment to studying the lives of Black Americans. Although the center is new, Â鶹´«Ã½AV's surveys have been giving Americans a voice since the 1930s. For most of those years, we've relied on interviewers to collect people's opinions -- either by speaking with them in person in their home or by calling randomly selected households.
Â鶹´«Ã½AV will continue to collect data on historically trended questions using landline and mobile telephone surveys. Live interviewers are instrumental in preserving these historical trends, because interviewer-administered surveys can elicit different measurement than surveys asked via a self-administered mode (such as web and mail).
In addition to updating historical trends, we will be conducting in-depth studies on key topics such as justice, health and wellbeing, economic opportunity, jobs and the workplace, education, community, and the environment. We will conduct studies that dive deep into these issues using the Â鶹´«Ã½AV Panel (which we discuss in more detail later). One primary advantage of using the Â鶹´«Ã½AV Panel for these studies is that it enables us to efficiently reach a large number of people, including oversamples of Black Americans, in a timely manner.
We have already started using this method to reach Black Americans. The first of these Â鶹´«Ã½AV Panel studies was a survey on racial equality, conducted June 23-July 6, 2020, which garnered responses from more than 35,000 Panel members.
What Is the Â鶹´«Ã½AV Panel?
The Â鶹´«Ã½AV Panel is a probability-based, randomly selected panel of approximately 100,000 U.S. adults who have agreed to complete Â鶹´«Ã½AV surveys on a regular basis. It is one of the largest panels of its kind in the world. Members are recruited to join the panel via address-based sampling (ABS) and random-digit-dial (RDD) telephone interviews that cover landlines and cellphones. The Â鶹´«Ã½AV Panel is a multimode panel, and members can be reached via web (if they have internet access), mail, telephone and SMS (text message). Individuals cannot opt in to the panel; membership is by random selection only.
Probability panels are the only existing source of randomly selected email addresses, which is important for achieving representative web-based surveys. Unlike opt-in panels, probability-based panels such as Â鶹´«Ã½AV's produce accurate estimates that are comparable to other probability-based data collection methodologies such as RDD[1][2] and ABS.
In addition to producing high-quality findings, collecting data via the Â鶹´«Ã½AV Panel has several advantages. The email sample frame can be used to rapidly deploy web-based surveys, which are significantly less expensive than more traditional methods such as RDD telephone surveys. The Panel also gathers a wide range of demographic information about each member. This information can be used to efficiently sample low-incidence members of the population who would otherwise be challenging and costly to reach. In the case of the Â鶹´«Ã½AV Panel, members can also be reached via a variety of modes, giving researchers flexibility to select the most appropriate mode for each study or to implement mixed-mode designs.
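The targeted-sampling advantage described above can be sketched in a few lines. This is a hypothetical illustration, not Â鶹´«Ã½AV's actual sampling code: the roster, field names and sample size are assumptions, and real designs would use more sophisticated stratification.

```python
import random

random.seed(42)  # for a reproducible illustration

# Hypothetical panel roster: demographic profile data is collected once at
# recruitment and stored, so it can be used for sampling later.
panel = [
    {"id": i,
     "race": random.choice(["Black", "White", "Hispanic", "Asian", "Other"]),
     "age": random.randint(18, 90)}
    for i in range(100_000)
]

# Because profiles are already known, a low-incidence subgroup can be sampled
# directly -- no costly general-population screening step is needed.
black_members = [m for m in panel if m["race"] == "Black"]
oversample = random.sample(black_members, k=min(5_000, len(black_members)))

print(f"Eligible members: {len(black_members)}; sampled: {len(oversample)}")
```

The key design point is that the expensive step (building a profiled, probability-based roster) is paid once at recruitment, after which each individual study can draw targeted samples cheaply.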
Limitations of Panel Data Collection
As with all survey methodologies, panel data collection is not without limitations. As with other modes and surveys (by Â鶹´«Ã½AV and other organizations), individuals who are younger, have lower education levels, have lower income levels, or identify with a racial or ethnic minority tend to respond to panel surveys at lower rates. Unique to panels, these groups also tend to have lower recruitment rates.[3][4]
These same groups that are less likely to join a panel are also less likely to respond to individual surveys and more likely to leave a panel. For example, Â鶹´«Ã½AV's study on racial equality achieved an overall participation rate of 42% -- but the participation rate of Black respondents, as well as individuals aged 18 to 44 and those with a high school education or less, was about 15 percentage points lower, on average.
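The arithmetic behind a participation-rate gap is straightforward. The counts below are invented for illustration (the blog reports only the 42% overall rate and the roughly 15-point gap, not the underlying denominators):

```python
# Illustrative counts only; the real invited/completed totals are assumptions.
invited_overall, completed_overall = 84_000, 35_280
invited_subgroup, completed_subgroup = 10_000, 2_700

rate_overall = completed_overall / invited_overall    # 0.42
rate_subgroup = completed_subgroup / invited_subgroup  # 0.27

gap_points = (rate_overall - rate_subgroup) * 100
print(f"Overall: {rate_overall:.0%}, subgroup: {rate_subgroup:.0%}, "
      f"gap: {gap_points:.0f} points")
```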
Once members have joined, we carefully monitor who leaves the Â鶹´«Ã½AV Panel and is no longer eligible to complete surveys. This is known as panel attrition. Some demographic groups have higher attrition rates than others, including younger adults, Black or Asian individuals, and those with lower education levels.
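Monitoring attrition amounts to tracking, for each demographic group, what share of members active at the start of a period is no longer active at the end. A minimal sketch with made-up membership counts:

```python
# Hypothetical membership snapshots by age group; the numbers are assumptions,
# chosen only to show younger adults attriting at a higher rate.
start_of_period = {"18-29": 12_000, "30-49": 30_000, "50+": 58_000}
left_panel      = {"18-29":  2_400, "30-49":  4_500, "50+":   5_800}

# Attrition rate = members who left / members at the start of the period.
attrition = {g: left_panel[g] / start_of_period[g] for g in start_of_period}
for group, rate in attrition.items():
    print(f"{group}: {rate:.0%} attrition")
```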
Our Commitment to Ensuring Our Research Covers All Black Americans
With all studies, it is important to understand the limitations of the methodology and to remember that no method perfectly succeeds in interviewing hard-to-reach subgroups in their actual population proportions. Researchers must understand the potential bias this introduces into the findings, both overall and for particular subgroups, and make adjustments to minimize that bias.
For the racial equality study referenced in this blog, as for all our studies, Â鶹´«Ã½AV applied post-stratification weighting adjustments to compensate for differential nonresponse by subgroup in the overall sample to make sure the weighted sample represented the overall U.S. population. But in this study, because it was so important for the results to be representative of the Black population, Â鶹´«Ã½AV also included adjustments for age, gender, education, income, race/ethnicity and region within the Black population. These adjustments ensured each of those subgroups in the weighted sample represented that subgroup in the U.S. population.
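The logic of a post-stratification adjustment can be shown with a minimal cell-weighting sketch. The benchmark shares and sample counts below are illustrative assumptions, not Â鶹´«Ã½AV's actual targets (which span age, gender, education, income, race/ethnicity and region, and are typically applied via raking rather than a single variable):

```python
# Assumed population benchmarks and a sample that underrepresents one cell.
population_share = {"HS or less": 0.40, "Some college": 0.30, "College+": 0.30}
sample_counts    = {"HS or less": 200,  "Some college": 300,  "College+": 500}
n = sum(sample_counts.values())

# Post-stratification weight for a cell = population share / sample share.
weights = {cell: population_share[cell] / (sample_counts[cell] / n)
           for cell in sample_counts}

# After weighting, each cell's weighted share matches its population share.
weighted_share = {cell: weights[cell] * sample_counts[cell] / n
                  for cell in sample_counts}
print(weights)
print(weighted_share)
```

Underrepresented respondents (here, "HS or less") receive weights above 1, so the weighted sample reflects their true population proportion.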
Â鶹´«Ã½AV also focuses much of its Panel recruitment and retention efforts on oversampling and engaging the groups we know tend to join the Panel at the lowest rates and attrite from the Panel at the highest rates.
Our Commitment to Helping Black Voices Be Heard
Â鶹´«Ã½AV is committed to continuing to use rigorous, representative methodologies to achieve high-quality, accurate results. To sustain our research for the Center on Black Voices, this includes a concerted effort to cover all Black Americans and to research potential methods for improving representation from subgroups that have historically been underrepresented. This research will include:
- understanding who is underrepresented in our research and exploring methods that best adjust for under-coverage or nonresponse bias
- understanding how differential incentives can help improve recruitment rates, response rates and attrition rates from groups that traditionally have lower-than-average participation
- experimenting with survey invitation and reminder messaging that helps build trust and motivate participation
- testing how survey design factors such as survey topic, question wording, survey length and visual survey design impact participation from different demographic groups
- exploring sampling methods that allow for targeted sampling of hard-to-reach demographic groups
For the past 85 years, Â鶹´«Ã½AV has been dedicated to conducting surveys that represent individuals with different backgrounds and different life experiences. Through both established surveys and innovative new types of research, Â鶹´«Ã½AV will continue to ensure that all Black Americans have their voices heard and are accurately represented in our research.
Footnotes
[1] Yeager, D., Krosnick, J., Chang, L., Javitz, H., Levendusky, M., Simpser, A., & Wang, R. (2011). Comparing the accuracy of RDD telephone surveys and internet surveys conducted with probability and non-probability samples. Public Opinion Quarterly, 75(4), 709-747.
[2] MacInnis, B., Krosnick, J., Ho, A., & Cho, M. (2018). The accuracy of measurements with probability and nonprobability survey samples: Replication and extension. Public Opinion Quarterly, 82(4), 707-744.
[3] Hoogendoorn, A., & Daalmans, J. (2009). Nonresponse in the recruitment of an internet panel based on probability sampling. Survey Research Methods, 3(2), 59-72.
[4] Callegaro, M., Baker, R., Bethlehem, J., Göritz, A. S., Krosnick, J., & Lavrakas, P. (2014). Online panel research: A data quality perspective. Wiley.