TL;DR
Problem
Designing technical interview questionnaires was a manual bottleneck that delayed hiring, forced customers to rely on FlairX admins, and hindered scalability.
Solution
An AI-powered, human-in-the-loop workspace that automates JD-to-Question generation while maintaining control through a "Review & Approve" interaction model.
7x Creation Efficiency
Reduced the end-to-end drafting process from 30 minutes of manual labor to under 3 minutes of AI-assisted review.
73% Adoption Increase
Successfully transitioned users from "Shadow Workflows" into our native platform by reducing friction.
CONTEXT
Moving from Internal Service to Customer Self-Service for Scaling
FlairX is a B2B SaaS platform for AI and expert-vetted technical interviews. Previously, Internal Admins manually drafted questionnaires, creating a massive operational bottleneck.
To scale, we shifted from a concierge service to an AI-powered framework, empowering customers to generate high-quality interview questionnaires themselves in minutes.
PROBLEM + GOAL
Two Hurdles to Scalability
Speed
Reduce questionnaire drafting from 30 minutes to instant generation, enabling the platform to scale from 3 to 100+ daily requests.
Adoption
Questionnaire creation ownership shifted from internal admins to customers. Adoption was hindered by a tedious post-creation process that needed a drastic redesign to support self-service usability.
DISCOVERIES
Identifying Gaps Across Manual Creation, AI, and Competitive Tools
User Interviews | Contextual Inquiry | Hands-On Experience | Stakeholder & Customer Reviews | Persona Creation
DESIGNING FOR AI
Ideation War Room with the CEO, PMs, Designers, and Dev Team
Outcome 1: Advocating for Human-in-the-Loop Flow
Users should feel in control rather than ambushed by automation.
"One-Click Automation" was a trap that would lead to generic questionnaires. Instead, a "Human-in-the-Loop" flow let users set parameters, then accept, reject, or iterate on the AI's output.
IMPACT
Increased customer satisfaction by ensuring the AI hit the mark faster, reaching the final version with fewer revisions.
Outcome 2: Dropping Underused Features
I audited the manual flow to separate high-value assets from "dead weight" features that would hinder the AI experience.
AI INTERACTION
Choosing the Right Interaction Model
Automating Requirement Extraction
"Isn’t this already in the JD? I don't have time to retype this."
To eliminate the friction of a 25-field setup, I replaced manual entry with an automated "Skill Extraction" step. By parsing the job description instantly, the AI identifies core requirements for a quick human review.
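The actual extraction is AI-driven, but the core idea can be illustrated with a minimal sketch. The skill taxonomy, function name, and keyword-matching approach below are all hypothetical simplifications, not FlairX's implementation:

```python
import re

# Hypothetical skill taxonomy -- a real system would use an AI model, not a fixed list.
KNOWN_SKILLS = ["Python", "React", "SQL", "AWS", "Kubernetes"]

def extract_skills(job_description: str) -> list[str]:
    """Return known skills mentioned in the JD, queued for quick human review."""
    found = []
    for skill in KNOWN_SKILLS:
        # Whole-word, case-insensitive match so "AWS" doesn't match inside other words.
        if re.search(rf"\b{re.escape(skill)}\b", job_description, re.IGNORECASE):
            found.append(skill)
    return found

jd = "We need a backend engineer strong in Python and SQL, with AWS experience."
print(extract_skills(jd))  # → ['Python', 'SQL', 'AWS']
```

The key design point survives even in this toy version: the system proposes, and the human reviews a short pre-filled list instead of typing 25 fields from scratch.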
IMPACT
Increased adoption rates by removing manual data entry and reducing late-stage macro iterations by securing alignment early in the flow.
Shifted to Single Scroll Navigation to Adapt to New User Behavior
Users shifted from a "creation" mindset to a rapid "review" mode. Giving each skill its own page suited the slow, one-by-one manual workflow, but it created too much friction for reviewing AI outputs. Moving to a single-scroll workspace let users audit the entire generated interview with minimal clicks.
IMPACT
Significantly reduced time on task through fewer clicks and less context-switching fatigue.
BUSINESS PIVOT
Adapting to a Last-Minute Shift to a Self-Service Model
As the business requirement moved from internal admin drafting to customer self-service, I adapted the interface to support a high-speed "Review and Approve" mindset.
1
Unified Interview Creation
Integrated question generation directly into the job posting flow to prevent task-fragmentation.
2
Job Description Parsing
Reduced manual setup by parsing job description files, eliminating redundant data entry.
3
Asynchronous Progress
Added a "Skip to Add Candidates" option to let users proceed with logistics while the AI generates questions.
IMPACT
Reduced form fatigue (25 fields → 8 fields), increasing adoption and reducing waiting time during question generation.
FINAL DESIGN
End-to-End Flow in Action
Notable Design Nuances
AI REASONING TO BUILD TRUST

SKIP WHILE GENERATING
COMPLETED GENERATION NOTIFICATION

EDITABLE ANSWER GUIDE

QUESTION BANK

REFLECTION
What I Learned
Clarity beats visual cleanliness
Explicit controls outperform elegant but hidden interactions.
AI UX patterns don't always transfer
Mental models matter more than visual familiarity (our questionnaire ≠ a document)
Break complexity into steps
Sequential generation is more predictable than all-at-once.