University of Sussex

Bot detection in online studies and experiments

journal contribution
posted on 2023-06-10, 01:14, authored by Dominik Piehlmaier
Most experimental and online studies in the empirical social sciences rely on online panels from crowdsourcing platforms such as Amazon Mechanical Turk (MTurk), Prolific, Qualtrics Online Panel, and their lesser-known competitors. The key benefit of all of these services is easy and affordable access to a large pool of diverse participants, a privilege previously reserved for globally leading and financially independent universities. However, this newly leveled playing field comes at a cost: semi- or fully automated response tools, also called bots, decrease data quality and reliability. This case describes how two online studies were conducted on a crowdsourcing platform in anticipation of bot responses. Specifically, the case offers insights into the study design process, the selection of appropriate survey questions and bot traps, and the ex-post analysis and filtering of bot responses. Best practices are identified and potential pitfalls explained. The description should aid readers in designing anticipatory online studies and experiments to increase data quality, validity, and reliability.
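The ex-post filtering the abstract describes might be sketched as follows. This is an illustrative example, not the author's actual procedure: it combines two signals commonly used as bot traps, a honeypot item (a question hidden from human participants, which only automated tools tend to answer) and implausibly fast completion times. All field names and the time threshold are hypothetical assumptions.

```python
# Minimal sketch of ex-post bot filtering (illustrative; field names
# and threshold are assumptions, not from the published case).

MIN_SECONDS = 120  # assumed minimum plausible completion time


def is_likely_bot(response: dict) -> bool:
    """Flag a survey response as a likely bot."""
    # Honeypot trap: the item is hidden from humans (e.g. via CSS),
    # so any non-empty answer suggests an automated response tool.
    if response.get("honeypot", "").strip():
        return True
    # Speeder check: completion far faster than humans plausibly manage.
    if response.get("duration_seconds", 0) < MIN_SECONDS:
        return True
    return False


responses = [
    {"id": 1, "honeypot": "", "duration_seconds": 300},
    {"id": 2, "honeypot": "lorem ipsum", "duration_seconds": 280},
    {"id": 3, "honeypot": "", "duration_seconds": 45},
]

clean = [r for r in responses if not is_likely_bot(r)]
print([r["id"] for r in clean])  # → [1]
```

In practice, researchers often combine several such signals and inspect flagged responses manually rather than dropping them automatically, since any single heuristic can also catch inattentive humans.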

History

Publication status

  • Published

File Version

  • Accepted version

Journal

SAGE Research Methods

Publisher

SAGE

Department affiliated with

  • Strategy and Marketing Publications

Notes

Online ISBN: 9781529601312

Full text available

  • Yes

Peer reviewed?

  • Yes

Legacy Posted Date

2021-10-01

First Open Access (FOA) Date

2021-10-07

First Compliant Deposit (FCD) Date

2021-10-01
