Program at a glance
Ramon Trias Fargas, 25-27
Conference Registration and the Conference Opening/Keynotes are located in the underground space between the Jaume I building (Building 20) and the Roger de Llúria building (Building 40). There will be signage and volunteers to direct you.
All sponsor exhibits will be in Room 30.S02 S. Expo on Friday 26th and Saturday 27th October.
Socializing with Surveys: Combining Big Data and Survey Data to Measure Public Opinion
|Chair||Dr Pascal Siegers (GESIS)|
|Time||Saturday 27th October, 09:00 - 10:30|
Social Media as an Alternative to Surveys of Opinions About the Economy
In review process for the special issue
Dr Frederick Conrad (University of Michigan) - Presenting Author
Dr Johann Gagnon-Barsch (University of Michigan)
Ms Robyn Ferg (University of Michigan)
Ms Elizabeth Hou (University of Michigan)
Dr Josh Pasek (University of Michigan)
Dr Michael Schober (The New School)
Sample surveys have been at the heart of the social research paradigm for many decades. Recently, there has been considerable enthusiasm among researchers about new types of data, created mostly online, that may be timelier and less expensive than traditional survey data. One example is social media content, which researchers have begun to explore as a possible supplement or even substitute for survey data across a range of domains. A good example is the study by O’Connor et al. (2010) which reports reasonably high correlations between the sentiment of tweets containing the word “jobs” and two survey-based measures: Gallup’s Economic Confidence Index (r = .79) and the University of Michigan’s Index of Consumer Sentiment (ICS) (r = .64).
A number of other studies have compared social media content to survey data (and related measures) and found some correspondence (e.g., Antenucci et al., 2014; Jensen & Anstead, 2013; Tumasjan et al., 2010). But not all of these initial success stories have held up. For example, analyzing data through 2011, we (Conrad et al., 2015) replicated – in fact strengthened – the relationship between the sentiment of “jobs” tweets and the Michigan ICS reported by O’Connor et al. But when we included data collected after 2011, the relationship degraded rapidly, becoming small and negative. Similarly, Antenucci et al. (2014) accurately predicted US unemployment, measured by initial claims for unemployment insurance, based on the frequency of tweets containing words and phrases such as “fired,” “axed,” “canned,” “downsized,” “pink slip,” “lost job,” “unemployed,” and “seek job.” However, starting in mid-2014 the predictions and actual claims began to diverge and have not returned to previous levels of agreement. This pattern of relatively strong relationships in early years followed by highly attenuated relationships in more recent years raises serious questions about the viability of using social media content in place of survey data.
The proposed presentation investigates the origins of the on-again, off-again relationship between the sentiment of “jobs” tweets and the ICS by manipulating several analytic attributes and tracking the impact on the association between the two. A critical attribute not yet addressed in the literature is how the tweets are preprocessed and what categories they are sorted into (if any). We trained a classifier to assign the tweets to “job ads,” “personal jobs,” “junk,” and “other”; junk tweets actually correlated more highly with the ICS than did the other categories, suggesting that the early correlation may well have been spurious. We also manipulated the smoothing and lagging intervals (in days), the sentiment dictionary used, how sentiment was calculated (e.g., the ratio of positive to negative word counts), and the particular survey questions contributing data (as opposed to the global ICS). The values of these analytic attributes can substantially affect the association between the two data sources, but the association is never high in absolute terms, increasing our skepticism that tweets (at least these) can easily substitute for survey data. We close by proposing preliminary best practices for analyzing Twitter content and comparing it to survey data.
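The analytic attributes mentioned above (dictionary-based sentiment scoring, smoothing, and lagging before correlating with a survey index) can be sketched roughly as follows. This is a minimal illustration under invented assumptions, not the authors' actual pipeline: the word lists, function names, and data are placeholders, and the real analysis varied dictionaries, intervals, and index components far more systematically.

```python
# Sketch of dictionary-based sentiment scoring with smoothing and a lagged
# Pearson correlation against a survey index. Lexicons here are toy
# placeholders, not the lexicon used in the presented study.
from statistics import mean

POSITIVE = {"good", "great", "hired", "gain"}   # hypothetical lexicon
NEGATIVE = {"bad", "fired", "lost", "axed"}     # hypothetical lexicon

def daily_sentiment(tweets):
    """Ratio of positive to negative word counts for one day's tweets."""
    words = [w for t in tweets for w in t.lower().split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    return pos / neg if neg else float(pos)

def smooth(series, window):
    """Trailing moving average over `window` days."""
    return [mean(series[max(0, i - window + 1): i + 1])
            for i in range(len(series))]

def lagged_pearson(x, y, lag):
    """Pearson r between x and y, with y led by `lag` days."""
    x, y = x[:len(x) - lag] if lag else x, y[lag:]
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)
```

Varying the lexicon, the smoothing window, and the lag in a sketch like this makes the abstract's point concrete: each choice changes the resulting correlation, so any single headline r value is fragile.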
Measuring the Strength of Attitudes in Social Media Data
Final candidate for the monograph
Dr Ashley Amaya (RTI International) - Presenting Author
Dr Ruben Bach (University of Mannheim)
Dr Frauke Kreuter (University of Maryland; University of Mannheim)
Dr Florian Keusch (University of Mannheim)
Scales are frequently used in public opinion research to determine the strength of individuals’ attitudes or beliefs. Individuals who fall on the extreme ends of the spectrum are unlikely to be swayed by politicians or affected by a social intervention, whereas individuals who feel less strongly may be more likely to change their opinion. It is therefore important to accurately measure attitude strength and understand the potential for measurement error on such scales.
Much research has been done to assess measurement error in attitude questions. For example, acquiescence bias is common among agree/disagree scale questions. Respondents may not want to create conflict or may satisfice, making them more likely to choose “agree,” regardless of their true feelings and regardless of the question being asked.
In the era of social media, researchers are beginning to use social media content to measure social attitudes. While previous research has documented the risk of bias in dichotomous estimates of public opinion, research has not yet determined whether social media data can be used to measure the strength of an attitude and whether such measures suffer from biases similar to those in survey data. In this presentation, we answer two research questions:
• Do non-traditional data produce attitude distributions that are similar to those from survey data?
• If not, is the difference attributable to missingness (i.e., undercoverage and nonresponse), or is it a function of measurement error?
To answer these questions, we used data on a range of social issues collected by the German European Social Survey, the German Internet Panel, Reddit posts, and the Reddit SampleSize survey – two probability-based surveys, social media posts, and a non-probability survey, respectively. We compared the distribution of responses for each topic across the four data sources, overall and by subdomain. To isolate measurement error from missingness, we limited the analysis to individuals who had completed the Reddit SampleSize survey and compared the distribution of attitudes in this subset's Reddit posts to their survey responses.
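A comparison of attitude-strength distributions across data sources, as described above, can be sketched with a simple distance measure between two distributions over the same scale. This is an illustrative sketch only: the five-point scale, the counts, and the use of total variation distance are assumptions for the example, not the authors' actual data or chosen statistic.

```python
# Sketch: comparing an attitude-strength distribution from a survey against
# one derived from social media posts. Counts are invented for illustration.
def distribution(counts):
    """Normalize category counts to proportions."""
    total = sum(counts.values())
    return {k: v / total for k, v in counts.items()}

def total_variation(p, q):
    """Total variation distance between two distributions (0 = identical)."""
    cats = set(p) | set(q)
    return 0.5 * sum(abs(p.get(c, 0) - q.get(c, 0)) for c in cats)

# Hypothetical responses on a 1-5 attitude-strength scale.
survey = distribution({1: 50, 2: 120, 3: 200, 4: 90, 5: 40})
reddit = distribution({1: 110, 2: 90, 3: 120, 4: 80, 5: 100})
divergence = total_variation(survey, reddit)
```

Computing the same distance within the subset of people observed in both sources, as the abstract proposes, would separate measurement differences from coverage and nonresponse differences.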
Protest Within an Authoritarian Context: Perception of Opportunities and Support for Protest Among Citizens in the Arab World
Mr Mohamed Elsayed Yousef (Universitat de Barcelona) - Presenting Author
Dr Camilo Cristancho (Universitat de Barcelona)
To what extent does an individual's perception of the political opportunity structure affect her attitudes towards protest? Theorists of contentious politics conclude that protest under authoritarian control is costly, dangerous and unexpected, yet people in authoritarian regimes still take to the streets in huge numbers and call for change. In recent years, the Arab world witnessed a wave of demonstrations that swept almost every country in the region, yet only a few studies have addressed citizens' support for protest. Focusing on the protest events that took place in Tunisia from 18 December 2010 to 14 January 2011, and on the #jan25 protests in Egypt between January and February 2011, we explore citizens' attitudes toward protest and compare how they vary before, during and after major events. We rely on repeated cross-sectional, nationally representative data from the Arab Barometer (N=1,200) and Gallup polls for the two countries (N=1,000), which measure multiple attitudes towards protest, and compare these with evidence from digital traces of protest events on Twitter, tracked through the hashtags #sidibouzid, #tunisia, #jan25 and #egypt for the most salient events. We find that perceptions of threat, inefficacy and non-responsive elites are associated with lower levels of support for protest, but also with anger and unease about democratic concerns. Most importantly, our analysis suggests broad variation in support for protest depending on issue position. Our results shed light on the importance of assessing individuals' perceptions of protest and of considering the heterogeneity among demonstrators and followers of contentious politics online.