Recruiting Participants
Updated: May 8, 2026
The Recruit option on the invite page lets you generate conversations quickly. Today, the only active recruiting path is Smart AI Agents, which generate simulated conversations matched to your target audience description. The Human Panel option is visible in the UI but is marked coming soon and cannot be requested yet.
Use AI participants to test whether your conversation design works before you spend time or budget collecting real responses. Use real invitations and embeds when you need production research evidence.
When to Use AI Participants
AI participants are useful for:
- Testing whether the agent asks the right questions
- Checking whether the conversation feels too long, too generic, or too leading
- Exercising completion flows, form fields, and automations before launch
- Generating sample conversations for stakeholder review
- Stress-testing edge cases with specific attitudes or scenarios
AI participant conversations are simulated. They should not be counted as real participant evidence, customer validation, or production research data.
Generate AI Participant Conversations
- Open your conversation
- Select Invite participants in the header, or Next: Invite participants from Design after an outline exists
- Select Recruit
- Choose Smart AI Agents
- Pick the number of AI conversations to generate
- Open Advanced Options if you want to review the target demographics or add a custom prompt
- Click Generate AI Conversations
The preset counts are 1, 3, 5, 10, and 20; a custom count can be any value from 1 to 25 conversations.
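If you keep an internal script or checklist around these limits, the documented count constraint can be expressed as a small validation helper. This is an illustrative sketch only; the `validate_conversation_count` function and `PRESET_COUNTS` tuple are hypothetical names, not part of the product.

```python
# Hypothetical helper reflecting the documented limits:
# presets of 1, 3, 5, 10, 20, and custom counts from 1 to 25.
PRESET_COUNTS = (1, 3, 5, 10, 20)
MIN_COUNT, MAX_COUNT = 1, 25

def validate_conversation_count(count: int) -> int:
    """Return count unchanged if it falls in the allowed custom range, else raise."""
    if not MIN_COUNT <= count <= MAX_COUNT:
        raise ValueError(
            f"count must be between {MIN_COUNT} and {MAX_COUNT}, got {count}"
        )
    return count
```

Any value in the 1-25 range passes, whether or not it matches a preset; values outside the range raise a ValueError.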
Advanced Options
The Recruit flow can use the participant description from your outline as the target demographic context. You can also add a custom prompt with extra instructions, such as the participant mindset, constraints, or scenario you want the simulation to exercise. Do not use the custom prompt to inject facts that belong in the conversation's Knowledge.
How AI Conversations Appear
Generated conversations appear in results as simulated conversations. They are useful for review and testing, but they are excluded from participant group membership and real participant lists.
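If you export results for analysis, it is worth separating simulated conversations from real ones programmatically before counting evidence. The sketch below assumes each exported record carries a boolean "simulated" flag; that field name is a hypothetical assumption, so adapt it to whatever your actual export schema uses.

```python
def split_conversations(conversations):
    """Partition exported records into (real, simulated) lists.

    Assumes each record is a dict with a hypothetical boolean
    "simulated" flag; a missing flag is treated as a real record.
    """
    real = [c for c in conversations if not c.get("simulated", False)]
    simulated = [c for c in conversations if c.get("simulated", False)]
    return real, simulated

records = [
    {"id": "c1", "simulated": True},
    {"id": "c2", "simulated": False},
    {"id": "c3"},  # no flag: counted as real
]
real, simulated = split_conversations(records)
```

Filtering at export time keeps design-QA conversations out of any tally you present as real participant evidence.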
If you need real responses, use a production collection method such as share links, invitations, embeds, or participant groups.
Human Panel Status
The Human Panel tab is currently a placeholder. It shows planned options such as target audience and participant count, but the request button is disabled. Use your own recruiting channels or participant groups for real participants until the panel option is available.
Best Practices
Run AI participants before launch. They are fast and can catch unclear instructions, missing context, and broken completion flows.
Use specific scenarios. "Generate five power users who disagree about pricing" is more useful than "generate five users."
Keep evidence separate. Treat simulated conversations as design QA. Do not mix them into claims about what real customers said.
Follow with real collection. Once the conversation performs well in simulation, collect real responses through links, invitations, embeds, or participant groups.
Common Pitfalls & Fixes
AI participants all sound similar → Add more specific demographic, role, or attitude instructions in Advanced Options.
The simulation misses important product facts → Add those facts to Knowledge, then generate again.
Stakeholders confuse simulated and real evidence → Label AI-generated conversations clearly in your research review and use real participant collection for final decisions.
Human panel cannot be requested → That option is not active yet. Use your own recruitment channels or saved participant groups.