American Association for Public Opinion Research, the leading association of public opinion and survey research professionals

How does the Transparency Initiative (TI) help the public evaluate and understand survey-based and other research findings?

Members of the Transparency Initiative are required to disclose a range of important details about how they conducted their research. This information is critical for assessing the rigor and appropriateness of the underlying methodology.

Members in good standing with the Transparency Initiative are reviewed every two years, with checks on how they describe their methodology to the public or, if the data are proprietary, to their clients.
The following is a list of required elements that must be disclosed with all public and/or proprietary research products.

A. Items for Immediate Disclosure
  1. Data Collection Strategy: Describe the data collection strategies employed (e.g., surveys, focus groups, content analyses).
  2. Who Sponsored the Research and Who Conducted It. Name the sponsor of the research and the party(ies) who conducted it. If the original source of funding is different from the sponsor, this source will also be disclosed.
  3. Measurement Tools/Instruments. Measurement tools include questionnaires with survey questions and response options, show cards, vignettes, and scripts used to guide discussions or interviews. Include the exact wording and presentation of any measurement tool from which results are reported, any preceding contextual information that might reasonably be expected to influence responses, and any instructions to respondents or interviewers. Also include scripts used to guide discussions and semi-structured interviews and any instructions to researchers, interviewers, moderators, and participants in the research. Content analyses and ethnographic research will provide the scheme or guide used to categorize the data; researchers will also disclose if no formal scheme was used.
  4. Population Under Study. Survey and public opinion research can be conducted with many different populations, including, but not limited to, the general public, voters, people working in particular sectors, blog postings, news broadcasts, and an elected official’s social media feed. Researchers will be specific about the decision rules used to define the population when describing the study population, including location, age, other social or demographic characteristics (e.g., persons who access the internet), and time (e.g., immigrants entering the US between 2015 and 2019). Content analyses will also include the unit of analysis (e.g., news article, social media post) and the source of the data (e.g., Twitter, Lexis-Nexis).
  5. Method Used to Generate and Recruit the Sample. The description of the methods of sampling includes the sample design and methods used to contact or recruit research participants or collect units of analysis (content analysis).
    1. Explicitly state whether the sample comes from a frame selected using a probability-based methodology (meaning selecting potential participants with a known non-zero probability from a known frame) or if the sample was selected using non-probability methods (potential participants from opt-in, volunteer, or other sources).
    2. Probability-based sample specification should include a description of the sampling frame(s), list(s), or method(s).
      1. If a frame, list, or panel is used, the description should include the name of the supplier of the sample or list and nature of the list (e.g., registered voters in the state of Texas in 2018, pre-recruited panel or pool).
      2. If a frame, list, or panel is used, the description should include the coverage of the population, including describing any segment of the target population that is not covered by the design.
    3. For surveys, focus groups, or other forms of interviews, provide a clear indication of the method(s) by which participants were contacted, selected, recruited, intercepted, or otherwise encountered, along with any eligibility requirements and/or oversampling.
    4. Describe any use of quotas.
    5. Include the geographic location of data collection activities for any in-person research.
    6. For content analysis, detail the criteria or decision rules used to include or exclude elements of content and any approaches used to sample content. If a census of the target population of content was used, that will be explicitly stated.
    7. Provide details of any strategies used to help gain cooperation (e.g., advance contact, letters and scripts, compensation or incentives, refusal conversion contacts) whether for participation in a survey, group, panel, or for participation in a particular research project. Describe any compensation/incentives provided to research subjects and the method of delivery (debit card, gift card, cash).
  6. Method(s) and Mode(s) of Data Collection. Include a description of all mode(s) used to contact participants or collect data or information (e.g., CATI, CAPI, ACASI, IVR, mail, or Web for surveys; paper and pencil, audio recording, or video recording for qualitative research) and the language(s) offered or included. For qualitative research such as in-depth interviews and focus groups, also include the length of the interviews or focus group sessions.
  7. Dates of Data Collection. Disclose the dates of data collection (e.g., data collection from January 15 through March 10 of 2019). If this is a content analysis, include the dates of the content analyzed (e.g., social media posts between January 1 and 10, 2019). 
  8. Sample Sizes (by sampling frame if more than one frame was used) and (if applicable) Discussion of the Precision of the Results.
    1. Provide sample sizes for each mode of data collection (for surveys, include sample sizes for each frame, list, or panel used).
    2. For probability sample surveys, report estimates of sampling error (often described as “the margin of error”) and discuss whether the reported sampling error or statistical analyses have been adjusted for the design effect due to weighting, clustering, or other factors.
    3. Reports of non-probability sample surveys will only provide measures of precision if they are defined and accompanied by a detailed description of how the underlying model was specified, its assumptions validated, and the measure(s) calculated.
    4. If content was analyzed using human coders, report the number of coders, whether inter-coder reliability estimates were calculated for any variables, and the resulting estimates.
  9. How the Data Were Weighted. Describe how the weights were calculated, including the variables used and the sources of the weighting parameters.
  10. How the Data Were Processed and Procedures to Ensure Data Quality. Describe validity checks, where applicable, including but not limited to: whether the researcher added attention checks or logic checks, or excluded respondents who straight-lined or completed the survey below a minimum time threshold; any screening of content for evidence that it originated from bots or fabricated profiles; re-contacts to confirm that the interview occurred, to verify the respondent’s identity, or both; and measures to prevent respondents from completing the survey more than once. Any data imputation or other data exclusion or replacement will also be discussed. Researchers will state whether any coding was done by software, human coders, or both; if automated coding was done, name the software and specify the parameters or decision rules that were used.
  11. A General Statement Acknowledging Limitations of the Design and Data Collection. All research has limitations and researchers will include a general statement acknowledging the unmeasured error associated with all forms of public opinion research.
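The sampling-error disclosure in item A.8 can be made concrete with a short worked calculation. For a simple random sample, the 95% margin of error for a proportion is z·√(p(1−p)/n); when weights vary, a common way to reflect the design effect is the Kish approximation, deff ≈ n·Σw²/(Σw)². A minimal sketch, with function names and numbers that are purely illustrative (they are not part of the TI requirements):

```python
import math

def margin_of_error(p, n, z=1.96, deff=1.0):
    """95% margin of error for a proportion, optionally inflated
    by a design effect (deff) from weighting or clustering."""
    return z * math.sqrt(deff * p * (1 - p) / n)

def kish_design_effect(weights):
    """Kish approximation: deff = n * sum(w^2) / (sum(w))^2,
    i.e., 1 + the squared coefficient of variation of the weights."""
    n = len(weights)
    s1 = sum(weights)
    s2 = sum(w * w for w in weights)
    return n * s2 / (s1 * s1)

# Illustrative survey: p = 0.50, n = 1000, with unequal weights.
weights = [1.0] * 700 + [2.0] * 300
deff = kish_design_effect(weights)           # ~1.12 for these weights
unadjusted = margin_of_error(0.50, 1000)      # ~3.1 percentage points
adjusted = margin_of_error(0.50, 1000, deff=deff)  # larger, as disclosed
```

Reporting the adjusted figure, as item A.8.2 requires, tells the reader that the stated precision already accounts for the weighting.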
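Item A.8.4 asks for inter-coder reliability estimates without prescribing a statistic. One common choice is Cohen’s kappa, which corrects two coders’ raw agreement for the agreement expected by chance. A minimal sketch (the category labels and data are invented for illustration):

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa for two coders rating the same items:
    (observed agreement - chance agreement) / (1 - chance agreement)."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    freq_a = Counter(coder_a)
    freq_b = Counter(coder_b)
    # Chance agreement: product of each coder's marginal category rates.
    chance = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / (n * n)
    return (observed - chance) / (1 - chance)

# Hypothetical codes from two coders on six content items.
a = ["pos", "neg", "pos", "neu", "pos", "neg"]
b = ["pos", "neg", "neu", "neu", "pos", "pos"]
kappa = cohens_kappa(a, b)  # ~0.48: moderate agreement beyond chance
```

Disclosing the number of coders, the statistic used, and the resulting estimate, as the item requires, lets readers judge whether the coding scheme was applied consistently.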
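The weighting description required by item A.9 often refers to raking (iterative proportional fitting), in which unit weights are repeatedly adjusted until the weighted marginal distribution of each weighting variable matches a population target. A bare-bones illustration, with variable names and targets that are entirely hypothetical:

```python
def rake(rows, targets, iters=50):
    """Iterative proportional fitting (raking). `rows` is a list of dicts
    of categorical variables; `targets` maps variable -> {category: share}.
    Returns one weight per row."""
    weights = [1.0] * len(rows)
    for _ in range(iters):
        for var, target in targets.items():
            # Current weighted total in each category of this variable.
            totals = {}
            for w, row in zip(weights, rows):
                totals[row[var]] = totals.get(row[var], 0.0) + w
            total = sum(totals.values())
            # Scale each unit so the weighted marginal hits the target.
            for i, row in enumerate(rows):
                cat = row[var]
                weights[i] *= target[cat] * total / totals[cat]
    return weights

# Hypothetical sample that over-represents one category (70% vs. 50%).
sample = [{"gender": "F"}] * 7 + [{"gender": "M"}] * 3
w = rake(sample, {"gender": {"F": 0.5, "M": 0.5}})
```

A disclosure meeting item A.9 would name the weighting variables (here, the hypothetical `gender`) and the source of the target shares (e.g., a census or administrative benchmark).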