Recruiting Clinicians for Remote UX Research
Hello, Sprig team and fellow researchers. I wanted to share a few hands-on lessons from running remote studies with busy healthcare professionals, and to ask whether others have tricks that work with your Sprig workflows.
Background: over the last year I ran three separate remote studies (moderated and unmoderated) with nurses and nurse educators, with small samples of 8–12 each. My aim was to test a clinical decision-support prototype and gather realistic task flows, not just opinions. I used Sprig for screener distribution and session follow-ups, and learned that small changes in wording and timing make a huge difference in recruitment and data quality.
What helped me most
• Time-aware screening: schedule invitations around shift patterns. A short note like “15–20 mins, asynchronous, can be done between shifts” lifted completion rates. Nurses appreciated the flexibility.
• Credibility in the screener: mention your institutional affiliation and add a short sentence about IRB/ethics or data handling, even when the study isn’t formal; this eases participation.
• Real incentives, delivered fast: I offered $25–$40 e-gift cards and sent them within 48 hours. Fast payouts build goodwill and encourage honest, thoughtful responses.
• Short, behavior-based tasks: replace abstract questions with “tell us about the last time you…” prompts. That produced richer, actionable answers than hypothetical prompts.
A small real-life example: in one study I changed a question from “Do you use clinical guidelines?” to “Describe the last guideline you consulted on a shift. What made it easy or hard to use?” The length of responses tripled, and we discovered UI issues we wouldn’t have seen otherwise.
One subtle note: several participants referenced their nursing literature review writing process when describing how they search for and synthesize guidelines. That gave us unexpected insight into how to structure in-app help and resources.
Question for the community: how are others using Sprig screeners to pre-qualify clinicians without creating friction? I am especially curious about automations (tags, follow-up logic) that keep recruitment efficient but still humane.
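To make the automation question concrete, here is the kind of tag-based pre-qualification logic I have in mind, as a minimal Python sketch. All the field names, tags, and thresholds here are hypothetical illustrations of my own screener, not Sprig’s actual API or data model:

```python
# Hypothetical sketch: map screener answers to tags that drive follow-up logic.
# Field names, tag strings, and thresholds are illustrative assumptions only.

def prequalify(response: dict) -> list[str]:
    """Return tags for a screener response; an empty list means disqualified."""
    tags = []
    if response.get("role") in {"nurse", "nurse_educator"}:
        tags.append("clinician")
    if response.get("shift_pattern") == "rotating":
        # Rotating shifts -> lead with the asynchronous option in the invite.
        tags.append("schedule-flexible-invite")
    if response.get("guideline_use_last_week", 0) >= 1:
        # Recent guideline users are highest priority for the CDS study.
        tags.append("recent-guideline-user")
    # Non-clinicians are screened out regardless of other answers.
    return tags if "clinician" in tags else []

# Example: a nurse on rotating shifts who consulted a guideline this week.
print(prequalify({
    "role": "nurse",
    "shift_pattern": "rotating",
    "guideline_use_last_week": 2,
}))  # ['clinician', 'schedule-flexible-invite', 'recent-guideline-user']
```

The idea is to keep the qualify/disqualify decision in one small, auditable function so the follow-up messaging (which tag triggers which invitation wording) stays easy to reason about and to adjust between studies.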
Thanks for reading; happy to share my (brief) screener template if people want to adapt it.
