25 March 2011 | Tara S. Behrend · David J. Sharek · Adam W. Meade · Eric N. Wiebe
The article explores the viability of using crowdsourcing, particularly Amazon's Mechanical Turk, as an alternative to traditional university participant pools for survey research in organizational psychology. The study compares the demographic characteristics, data quality, and psychometric properties of responses from crowdsourced and university samples. Key findings include:
1. **Demographic Differences**: Crowdsourced respondents were older, more ethnically diverse, and had more work experience compared to university participants.
2. **Data Quality**: Although crowdsourced data showed slightly higher social desirability, its reliability and internal consistency were equal to or better than the university sample's, suggesting comparable overall data quality from both sources.
3. **Psychometric Properties**: Most scales functioned equivalently across samples, with only a few items showing differential functioning, particularly in the areas of openness and conscientiousness.
4. **Motivations for Participation**: Financial incentives were the primary motivation for Mechanical Turk participants, though educational and entertainment benefits were also noted.
The study concludes that crowdsourcing is a viable and efficient alternative to traditional university participant pools, offering a more diverse and representative sample, especially for organizational research. However, it highlights the need for ethical considerations, such as ensuring informed consent and addressing potential social desirability bias.