eScholarship — Open Access Publications from the University of California

UCLA Previously Published Works
Comparing the Recruitment of Research Participants With Chronic Low Back Pain Using Amazon Mechanical Turk With the Recruitment of Patients From Chiropractic Clinics: A Quasi-Experimental Study.

Abstract

Objective

The purpose of this study was to compare the crowdsourcing platform Amazon Mechanical Turk (MTurk) with in-person recruitment and web-based surveys as a method to (1) recruit study participants and (2) obtain low-cost data quickly from chiropractic patients with chronic low back pain in the United States.

Methods

In this 2-arm quasi-experimental study, we used in-person clinical sampling and web-based surveys from a separate study (RAND sample, n = 1677, data collected October 2016 to January 2017) compared with MTurk (n = 310, data collected November 2016) as a sampling and data collection tool. We gathered patient-reported health outcomes and other characteristics of adults with chronic low back pain receiving chiropractic care. Parametric and nonparametric tests were run. We assessed statistical and practical differences based on P values and effect sizes, respectively.

Results

Compared with the RAND sample, the MTurk sample was statistically significantly younger (mean age 35.4 years, SD 9.7 vs 48.9 years, SD 14.8), had lower income (24% vs 17% reporting annual income below $30,000), and reported worse mental health. The MTurk sample also included more men (37% vs 29%), fewer White patients (87% vs 92%), more Hispanic patients (9% vs 5%), fewer people with a college degree (59% vs 68%), and more patients working full time (62% vs 58%). The MTurk sample was more likely to have chronic low back pain (78% vs 66%), with differences in pain frequency and duration, and reported less disability and better global health scores. In terms of efficiency, the surveys cost $2.50 per participant in incentives for the MTurk sample; survey development took 2 weeks and data collection took 1 month.

Conclusion

Our results suggest that there may be differences between crowdsourced and clinic-based samples, ranging from small to medium in demographics and self-reported health. The low incentive costs and rapid data collection of MTurk make it an economically viable method of collecting data from chiropractic patients with low back pain. Further research is needed to explore the utility of MTurk for recruiting clinical samples, such as comparisons to nationally representative samples.

