American Association for Public Opinion Research (AAPOR)
The leading association of public opinion and survey research professionals

Short Courses

The 2017 Conference will include seven short courses to enhance your learning experience. These in-depth, half-day courses are taught by well-known experts in the survey research field and cover topics that affect our ever-changing industry. The 2017 Short Courses include:

Course 1: Dashboards for Active Survey Monitoring
Course 2: Sexual Orientation and Gender Identity (SOGI) Measurement in Surveys: History and Best Practices from Kinsey to CHIS and NHIS
Course 3: Mixed-Mode Surveys: An Overview of Estimation and Adjustment Methods and Empirical Applications
Course 4: An Introduction to Practical Text Analytics for Qualitative Research
Course 5: Visual Design for Single- and Mixed-Mode Surveys
Course 6: Into the Stream: An Introduction to Big Data Access for Survey Researchers and Social Scientists
Course 7: Designing Surveys to Combat Declining Response Rates and Increasing Data Collection Costs
 
Short Course Details
Course 1
Title: Dashboards for Active Survey Monitoring
Date: Wednesday, May 17, 2:30 p.m. – 6:00 p.m.
Course Overview:
What is a dashboard? The term surfaced in business information systems in the 1990s and became popular in the last decade, but has made only occasional appearances in the survey research literature. Dashboards can give clients, project directors, survey methodologists, and managers critical information for decision-making at a glance, on a single screen. They can present alerts about unusual events that fall too far from the mean to be considered random noise. They can serve as a portal for drilling down into survey data, paradata, and other databases to investigate problems. In surveys that use adaptive design, they can inform users when it is prudent to change protocols. In this short course we define business dashboards and discuss their advantages for monitoring key performance indicators in surveys. We describe the basic kinds of dashboards (strategic, operational, performance), defined by different user groups and needs. Visualization is a critical component, and examples illustrate design principles and pitfalls. The core content of the course is an introduction to dashboard design and data visualization principles, and techniques for applying them in the context of web, telephone, mail, and face-to-face surveys.
Instructor:
Brad Edwards is a vice president at Westat in Rockville, Maryland, with more than 30 years of experience designing and managing large, complex surveys of households and establishments.  He co-teaches a short course on survey management for the Joint Program in Survey Research and has co-edited books on multicultural issues in surveys, hard-to-survey populations, and (in press) total survey error.
Course Objectives:
• To develop a basic understanding of dashboards and how they can support survey management.
• To acquire beginner-level knowledge of dashboard design in the survey context.
Who Should Attend:
Survey designers, managers, process quality professionals and methodologists from both the client and the supplier sides of the research world; students who are interested in conducting surveys; and IT professionals seeking ways to provide effective survey management support.

 
Course 2
Title: Sexual Orientation and Gender Identity (SOGI) Measurement in Surveys: History and Best Practices from Kinsey to CHIS and NHIS
Date: Wednesday, May 17, 2:30 p.m. – 6:00 p.m.
Course Overview:
Sexual orientation and gender identity (SOGI) have been studied for decades, but have only recently been included in large-scale, general-population surveys and polls. This course traces the history of SOGI measurement from early studies (e.g., Kinsey), to probability-based surveys like the General Social Survey (GSS), the California Health Interview Survey (CHIS), the Behavioral Risk Factor Surveillance System (BRFSS), and the National Health Interview Survey (NHIS). This historical perspective is bolstered by quantitative literature on SOGI questions, including pretesting results. Drawing on published best practices, several methods of asking SOGI are addressed, with commentary on their resulting prevalence rates across surveys. These are discussed in the context of current efforts within the US Federal Statistical System to promote SOGI measurement in Federal surveys broadly (e.g., the OMB Federal Interagency Working Group on SOGI Measurement). Recommendations for including SOGI questions in surveys of various modes are discussed, highlighting successes from CHIS, NHIS, and other large-scale interview-based surveys. This course will benefit anyone working with or wanting to work with SOGI data, and survey researchers tasked with adding SOGI questions to their surveys. Open questions in the study of SOGI and the future of SOGI measurement in surveys will be discussed as well.
Instructor:
Matt Jans, PhD, is Survey Methodologist for the California Health Interview Survey (CHIS) at UCLA. Jans is nationally known for CHIS’s gender identity measurement, a collaboration with the Williams Institute at UCLA, and for his sexual orientation nonresponse research published in the American Journal of Public Health. He is adjunct faculty at UConn’s Graduate Program in Survey Research, JPSM, and the International Program in Survey and Data Sciences. His PhD is from the Michigan Program in Survey Methodology.
Course Objectives:
• A clear understanding of best practices for asking SOGI questions in surveys, and how to apply those practices in one’s own survey, whatever the mode.
• An understanding of the historical context of SOGI measurement broadly, and specifically within surveys.
• Perspective on how SO and GI rates vary across surveys.
Who Should Attend:
This course will benefit anyone working with or wanting to work with SOGI data for "substantive" research, as well as survey researchers tasked with adding SOGI questions to their surveys. Academic researchers who study LGBT issues, pollsters, policy experts, methodologists, and survey managers should all find something helpful in this course. No prerequisites necessary.

 
Course 3
Title: Mixed-Mode Surveys: An Overview of Estimation and Adjustment Methods and Empirical Applications
Date: Wednesday, May 17, 2:30 p.m. – 6:00 p.m.
Course Overview:
Although the choice of data collection mode has always been a key component of survey design, survey researchers now face greater complexity in mode decisions. This increasing complexity results from technological developments and from a better understanding of how mode affects measurement error in particular. Briefly, mixed-mode surveys use a combination of data collection methods to increase coverage, response rates, and data quality. The mixed-mode design process involves dynamic survey error trade-offs that rely simultaneously on empirical findings, practical knowledge, and theory. As a result, there is an extra burden on the survey researcher to be aware of the specific gaps and assumptions in particular designs, and of the implications of those assumptions for survey inference. The class will cover common designs and the motivations behind them, including data analysis methods, with particular attention to the presence of selection effects.
Instructor:
Dr. Tuba Suzer Gurtekin is a research fellow in survey methodology at the University of Michigan. Her research and practice on mixed-mode surveys include the design and analysis of mixed-mode customer satisfaction studies, the design and analysis of mixed-mode survey experiments in an ongoing monthly general population telephone survey, and the evaluation of mixed-mode survey inference methods. She has taught classes on mixed-mode surveys, fundamentals of survey methodology, and randomized and nonrandomized designs. Her current research focuses on mixed-mode survey inference methods, response style adjustment methods, and respondent-driven sampling data analysis.
Course Objectives:
• Students will have the basic knowledge and the understanding of practical needs and constraints that are behind various mixed-mode surveys.
• Students will have the basic knowledge and the understanding of theories and principles that govern the specific mixed-mode survey data analysis.
Who Should Attend:
Those who plan to conduct mixed-mode surveys and would like to discuss the implications of design features for inference, as well as those who plan to analyze mixed-mode survey data from secondary sources and would like to learn more about the analysis methods.

 
Course 4
Title: An Introduction to Practical Text Analytics for Qualitative Research
Date: Thursday, May 18, 8:00 a.m. – 11:30 a.m.
Course Overview:
Text analysis has become increasingly popular as practitioners look for ways to sort, categorize, compare, and distill meaning from unstructured text data. These data include, for example, transcripts and notes from focus groups, in-depth interviews, speeches or ethnographies, open-ended survey questions, and social media posts, tweets, or blogs. We will cover the current state of text analysis for qualitative research, including methods for basic text summaries and analyses, document categorization and corpus comparison, as well as text annotation and sentiment analysis. We will also discuss current directions in text analytics for qualitative researchers, including the movement toward natural language processing and topic modeling, which takes text analysis from sorting, counting, and categorization to thematic analysis of data. We will talk about some of our own work, both in the examination of the text analytic process and in natural language processing and topic modeling. We will also demonstrate a practitioner-friendly tool we are developing to address some of the key pain points in qualitative data analytics.
Instructors:
Andrew Stavisky is a senior design methodologist at the GAO, where he is one of the leaders of GAO’s qualitative research group. Andrew has nearly 20 years of experience as a senior analyst, senior methodologist, and qualitative researcher for a handful of top market research, public affairs, and polling organizations. Andrew is also an adjunct professor at the George Washington University School of Media and Public Affairs. Andrew’s applied research explores ways to create efficiencies in the qualitative analysis and unstructured text analysis process. Andrew received his PhD from the University of Maryland, College Park School of Public Health.
Philip Resnik is Professor of Linguistics at the University of Maryland, with a joint appointment at the University of Maryland Institute for Advanced Computer Studies and an affiliate appointment in Computer Science. He received his bachelor's degree in Computer Science at Harvard in 1987 and his Ph.D. in Computer and Information Science at the University of Pennsylvania in 1993, and joined the University of Maryland faculty in 1996. His industry experience prior to entering academia includes time in R&D at Bolt Beranek and Newman, IBM T.J. Watson Research Center, and Sun Microsystems Laboratories.
Course Objectives:
• Basic understanding of the current state of qualitative analysis.
• Basic understanding of future developments in text analytics.
Who Should Attend:
Anyone who needs to analyze unstructured text data from focus group or IDI transcripts, open-ended survey questions, social media tweets/posts, or any other text data. No pre-requisites.

 
Course 5
Title: Visual Design for Single- and Mixed-Mode Surveys
Date: Thursday, May 18, 8:00 a.m. – 11:30 a.m.
Course Overview:
This short course will focus on how to achieve more effective and functional survey designs and layouts.  The focus will be primarily on mail and web surveys, but some attention will be given to visual design for interviewers in in-person and telephone surveys.  The course will also cover visual design for mixed-mode surveys (i.e., how to achieve unified mixed-mode designs).  The course will provide an overview of the mechanics of visual processing as well as key concepts from the vision sciences that can help surveyors think through how to accomplish their goals with visual design.  Throughout the course, examples of how the visual design concepts can be applied to a questionnaire to make visual processing more efficient and effective will be given.  In addition, empirical evidence of the effectiveness of visual design elements will be provided.  The examples will cover visual design issues at both the individual question level and at the level of whole pages or screens.
Instructor:
Jolene Smyth is an Associate Professor in the Department of Sociology and the Director of the Bureau of Sociological Research at the University of Nebraska-Lincoln where she teaches graduate level courses on data collection methods and questionnaire design.  Her research broadly focuses on survey measurement and nonresponse.  She is co-author with Don Dillman and Leah Christian of “Internet, Phone, Mail, and Mixed-Mode Surveys: The Tailored Design Method” (2014). Her current projects focus on visual design of questionnaires, mixed-mode surveys, question wording, the design of within-household selection techniques in self-administered surveys, CATI questionnaire design, and interviewer/respondent interactions.
Course Objectives:
• Demonstrate the importance of visual design for survey measurement.
• Introduce participants to key visual design concepts and their application to survey research.
• Provide an overview of survey visual design research findings.
Who Should Attend:
Researchers of all levels who design and conduct surveys in self-administered visual modes, as well as those designing interviewer-administered surveys. It will also be useful to those designing survey software (particularly web and mobile-web survey platforms, but also CATI software) and to students or researchers interested in conducting visual design research.

 
Course 6
Title: Into the Stream: An Introduction to Big Data Access for Survey Researchers and Social Scientists
Date: Thursday, May 18, 8:30 a.m. – 11:30 a.m.
Course Overview:
Many researchers predicted that with the rise of Big Data, the need for survey-based data collection might wane or become obsolete. While Big Data can provide many insights, it often cannot answer the “why” question; such questions, in our opinion, remain well suited to survey research methods. However, in an age of rising costs, lower response rates, and harder-to-reach populations of interest, we entertain the question of what help Big Data can provide survey researchers in order to improve survey questions, survey designs, and analyses. Starting at the source, this short course takes a step back from data science/machine learning heavy courses to first ask, “how can I collect the Big Data that I need to measure public opinion?” In particular, we highlight two popular approaches to Big Data collection and discuss their benefits and limitations. First, web scraping offers methods for collecting data from both structured and unstructured web pages. Second, data APIs offer portals for gathering (semi-)structured data as it is generated (or queried) by Big Data sources. We will illustrate both approaches with real-world examples and demonstrate their usage. Where applicable, R code will be provided to participants.
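To give a flavor of the web-scraping approach described above: the course itself provides R code, but the idea is language-agnostic. Below is a minimal Python sketch using only the standard library, parsing a hard-coded HTML snippet in place of a live page (the sample page and its element structure are invented for illustration; a real scraper would fetch the page over HTTP and respect the site's terms of use and robots.txt).

```python
from html.parser import HTMLParser

# Stand-in for a fetched web page; the structure here is hypothetical.
SAMPLE_PAGE = """
<html><body>
  <ul id="headlines">
    <li>Response rates continue to decline</li>
    <li>New public-data API released</li>
  </ul>
</body></html>
"""

class HeadlineScraper(HTMLParser):
    """Collects the text content of every <li> element on the page."""
    def __init__(self):
        super().__init__()
        self._in_li = False
        self.headlines = []

    def handle_starttag(self, tag, attrs):
        if tag == "li":
            self._in_li = True

    def handle_endtag(self, tag):
        if tag == "li":
            self._in_li = False

    def handle_data(self, data):
        # Keep only non-whitespace text that appears inside an <li>.
        if self._in_li and data.strip():
            self.headlines.append(data.strip())

scraper = HeadlineScraper()
scraper.feed(SAMPLE_PAGE)
print(scraper.headlines)
```

The same extraction step applies whether the text comes from a scraped page or from an API response; with an API, the data typically arrive already structured (e.g., as JSON), so the parsing step is simpler.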
Instructors:
Adam Eck, Ph.D. is an Assistant Professor of Computer Science at Oberlin College. Adam’s teaching and research interests include intelligent agents, machine learning, data science (including Survey Informatics) and their application to real-world problems.
Trent Buskirk, Ph.D. is a Professor of Data Science and Survey Methods and the Director of the Center for Survey Research at the University of Massachusetts Boston. Trent's research involves the use of machine learning for survey design and analysis. Trent also investigates the use of modern technology for survey data collection. He has taught several short courses and is currently the AAPOR Associate Conference Chair.
Course Objectives:
• Inform participants about how Big Data sources on the web provide data along with a discussion of possible error properties and/or limitations for these sources.
• Demonstrate basic principles of web scraping.
• Expose participants to APIs for semi-structured data on the web.
Who Should Attend:
Anyone interested in the potential collaboration between Big Data and survey research is invited to attend. We especially invite those interested in learning more about what types of data Big Data sources can provide to survey researchers, as well as how those data might be acquired. The course is aimed at being informative, practical, and full of examples!

 
Course 7
Title: Designing Surveys to Combat Declining Response Rates and Increasing Data Collection Costs
Date: Sunday, May 21, 8:00 a.m. – 11:30 a.m.
Course Overview:
There are a number of issues facing surveys today, but two preeminent challenges that have had profound effects are declining survey participation and increasing survey costs. The threat to probability-based survey inference has never been greater. Simplistic solutions such as allowing lower response rates and reducing sample sizes can threaten the precision and accuracy of survey estimates. As a result, there is increased need for more complex survey designs that protect the integrity of the survey estimates.
This course aims to provide background and practical tools to address declining response rates and the resulting risk of nonresponse bias through survey design. Multi-phase, multi-protocol study designs are discussed, along with two-stage sampling for nonresponse. The use of statistical models during data collection for nonresponse bias reduction and models for cost reduction are introduced. Responsive and adaptive survey designs are introduced, in the special case of addressing nonresponse and cost.
Examples are presented from telephone, in-person, and multi-mode surveys. The examples are used to illustrate alternative approaches, as well as design decisions based on the relative importance of multiple objectives in a survey (e.g., bias reduction vs. variance minimization).
Instructor:
Andy Peytchev is a Research Assistant Professor at the Institute for Social Research, University of Michigan. Prior to the ISR, he was a Senior Survey Methodologist in the Program for Research in Survey Methodology at RTI International. He has led grant-funded research on methods to reduce nonresponse bias through data collection design and postsurvey adjustments, and has led sampling, responsive design, and weighting on national surveys. He has published research in this area and has taught courses on related topics. He received his PhD in Survey Methodology from the University of Michigan.
Course Objectives:
• Introduce the need for alternative survey designs that address nonresponse and cost challenges, tailored to a particular study’s objectives.
• Provide a set of tools that could be used in designs to address the threat of nonresponse and could help reduce costs.
Who Should Attend:
Survey methodologists and practitioners who are involved in study design and data collection management. Statistical background is helpful but not required for the course.



Platinum Sponsors

Abt Associates
GfK
Headway in Research
ICF
IMPAQ
Nielsen
NORC
RTI
SSRS
Westat

Gold Sponsors

AIR
AVP Social Science Research
Dial800/Reconnect Research
Mathematica
MDRC

Silver Sponsors

MJT
Oxford University Press
UCONN

Bronze Sponsors

D3
EdChoice
Survox

Conference Supporters

SSI

Exhibitors

3Q Global
AANP
Abt Associates
Adapt, Inc.
ASDE Survey Sampler
AVP Social Science Research
cApStAn LQC Inc
Dial800/Reconnect Research
Gallup
GfK
Gravic Inc Remark Software
Headway in Research
ICF
ICPSR
IMPAQ
Issues & Answers Network
Langer Research
Mathematica
MDRC
mFour
Michigan Program in Survey Methodology
MJT
Nielsen
NORC
Opinion Access
Oxford University Press
Provalis Research
Rand
Reconnaissance Market Research
Revily Inc
Roper
RTI
SSI
SSRS
Stampede Consulting
Stata Corp LP
STS
Swift Prepaid Solutions
UCONN
USDA
Voxco
Westat