25 March 2011 | Tara S. Behrend · David J. Sharek · Adam W. Meade · Eric N. Wiebe
Crowdsourcing has emerged as a viable alternative to traditional university participant pools for collecting survey data in behavioral research. This study compared crowdsourced data with data from a university sample, finding that crowdsourced respondents were older, more ethnically diverse, and more experienced in the workforce. The reliability of the crowdsourced data was as good as or better than that of the university sample, and measurement invariance generally held across groups. Despite small differences in personality and socially desirable responding, the authors conclude that crowdsourcing is an efficient and appropriate method for organizational psychology research. The study outlines the risks and advantages of crowdsourcing and provides practical and ethical guidelines.
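The reliability claim refers to internal-consistency estimates (Cronbach's alpha) computed separately for each sample. As a minimal illustration of that kind of comparison, the sketch below computes alpha for two groups of item-level responses; the function, the simulated data, and all variable names are hypothetical and are not taken from the study's materials.

```python
import numpy as np
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Internal consistency: alpha = k/(k-1) * (1 - sum(item vars) / var(total))."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()  # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)    # variance of the scale totals
    return (k / (k - 1)) * (1 - item_vars / total_var)

def simulate_sample(n: int, k: int, rng: np.random.Generator) -> pd.DataFrame:
    """Toy data: one latent trait plus item noise (real responses would be Likert)."""
    trait = rng.normal(size=(n, 1))
    return pd.DataFrame(trait + rng.normal(scale=1.0, size=(n, k)))

rng = np.random.default_rng(42)
crowd = simulate_sample(n=270, k=10, rng=rng)   # crowdsourced sample
campus = simulate_sample(n=270, k=10, rng=rng)  # university sample

print(f"crowdsourced alpha: {cronbach_alpha(crowd):.2f}")
print(f"university alpha:   {cronbach_alpha(campus):.2f}")
```

Comparing alphas scale by scale across the two samples, ideally with bootstrapped confidence intervals, is the kind of data-quality check the study reports.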
Crowdsourcing involves recruiting an online, independent, global workforce for specific tasks. Mechanical Turk, a well-known crowdsourcing platform, allows researchers to recruit participants for studies. The study used Mechanical Turk to recruit 270 adults, each paid $0.80 for participating, and compared them with 270 undergraduate students from a university. The crowdsourced sample was more diverse, older, and had more relevant work experience. Data quality was comparable between the two samples: the crowdsourced sample showed slightly higher socially desirable responding, but its reliability was as good as or better than that of the university sample. The psychometric properties of commonly used organizational research surveys were also similar across samples, with only minor differential item functioning.
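Differential item functioning (DIF) asks whether respondents from the two samples who stand at the same overall level on a scale nonetheless answer a particular item differently. The study used formal measurement-invariance methods; the snippet below is only a crude regression-based screen for uniform DIF, with hypothetical names throughout, regressing each item on the rest score plus a sample indicator.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

def uniform_dif_screen(items: pd.DataFrame, group: pd.Series) -> pd.DataFrame:
    """Screen for uniform DIF: regress each item on the rest score plus a
    group dummy. A significant group coefficient means the samples differ on
    that item even at equal overall trait levels."""
    results = []
    for col in items.columns:
        rest = items.drop(columns=col).sum(axis=1)  # rest score excludes the studied item
        X = sm.add_constant(pd.DataFrame({"rest": rest, "group": group}))
        fit = sm.OLS(items[col], X).fit()
        results.append({"item": col,
                        "group_coef": fit.params["group"],
                        "p_value": fit.pvalues["group"]})
    return pd.DataFrame(results)

# Hypothetical usage: stack both samples; 0 = university, 1 = crowdsourced.
rng = np.random.default_rng(7)
data = pd.DataFrame(rng.integers(1, 6, size=(540, 10)),
                    columns=[f"item{i}" for i in range(10)])
group = pd.Series([0] * 270 + [1] * 270)
print(uniform_dif_screen(data, group).round(3))
```

This screen catches only uniform DIF; detecting non-uniform DIF would require adding a rest-by-group interaction term or using the IRT-based procedures common in this literature.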
The study addressed five research questions: demographic characteristics of crowdsourced participants, data quality, psychometric properties, personality and attitude differences, and motivations for participation. The crowdsourced sample showed slight differences in demographics and personality traits but similar data quality. Participants in the crowdsourced sample were primarily motivated by financial incentives, while university students were motivated by course credit. The study concluded that crowdsourcing is a viable alternative to university participant pools, offering more diverse and representative samples for organizational research. However, ethical considerations and potential biases in data quality must be addressed. The study also highlighted the importance of compensation levels and the need for further research on the effects of varying compensation on data quality.