Cereba - AI-Powered Onboarding Setup
Scalable AI onboarding built for clarity, speed, and user trust.
What is Cereba?
Cereba is a SaaS platform enabling businesses and agencies to craft AI-driven sales and marketing campaigns. The onboarding process serves as the initial step in training the AI to align with the brand's tone, objectives, and target audience.
Role: Head of Product & UX
Company: Early-stage AI SaaS startup
Timeline: MVP launch through continuous iteration.
Challenge
Establishing user trust in AI-generated content from the beginning.
Designing a swift yet customizable setup process.
Accommodating varying user knowledge levels and content quality.
Creating distinct, clear entry paths for different user types.
My Contribution
Led the strategy, design, and implementation of the AI onboarding experience.
Defined the onboarding vision and created the end-to-end user flow based on behavioral data, product goals, and trust-building best practices.
Partnered cross-functionally with engineering to build scalable logic that adapted to user type, campaign complexity, and business size.
Designed and tested multiple onboarding paths, translating user insights and funnel analytics into actionable improvements that increased completion rates and reduced confusion.
Goal
Reduce onboarding friction and uncertainty.
Enable users to establish a personalized AI setup in under five minutes.
Cater to both individual and agency users with tailored experiences.

The Design Process
1 Understand
Initial Approach: Launched with a comprehensive onboarding experience, asking over 25 questions about the user's company, industry, tone, and goals to equip the AI effectively.
What We Saw
Significant drop-off at the AI personality step, as tracked via Google Looker Studio.
User feedback indicated reluctance to repeat the extensive setup for each campaign.
Increased volume of onboarding support requests in the early stages.
What We Measured
Funnel completion rates and drop-off points at each step.
Conversion rates to campaign creation.
Tags on support tickets related to onboarding issues.
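To make the funnel measurement concrete, here is an illustrative sketch (not the team's actual pipeline) of how per-step completion and drop-off rates can be derived from event counts, the kind of breakdown we reviewed in Google Looker Studio. The step names and counts are hypothetical placeholders.

```python
# Hypothetical funnel event counts, ordered from first to last step.
FUNNEL_STEPS = [
    ("started_onboarding", 1000),
    ("company_details", 820),
    ("ai_personality", 430),   # the step where the biggest drop appeared
    ("campaign_created", 310),
]

def funnel_report(steps):
    """Return (step, completion_vs_start, drop_off_vs_previous_step) tuples."""
    report = []
    start = steps[0][1]
    prev = start
    for name, count in steps:
        completion = count / start           # share of all users who reached this step
        drop_off = 1 - count / prev          # loss relative to the previous step
        report.append((name, round(completion, 3), round(drop_off, 3)))
        prev = count
    return report

for name, completion, drop in funnel_report(FUNNEL_STEPS):
    print(f"{name:<22} reached {completion:>6.1%}  drop-off {drop:>6.1%}")
```

A report like this makes it easy to see at a glance where attention is lost, rather than only the end-to-end conversion number.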
What Surprised Us
Low urgency to complete onboarding
Many users began setup but didn’t complete it in a single session. Some returned days later, while others expressed they weren’t “ready” to use AI or didn’t understand how it could help their business.
Misaligned user assumptions
We initially designed for GoHighLevel users and digital agencies. In practice, most users were small business owners without websites or CRM tools. Many didn’t have a clearly defined marketing process, which shaped how they perceived the usefulness of AI.
What We Learned
Users lacked a clear understanding of the AI setup's benefits.
A need to focus on educating users about AI capabilities, not just streamlining the process.
First personality creation experience.
Google Looker data visual.
User personas for individual and agency.
2 Sketch
We explored three onboarding directions, each with a different take on speed, clarity, and control. The goal was to uncover the right balance between automation and user guidance, a balance that worked for both individual users and agencies.
We Explored Possibilities
1. High User Control
We used a chat-style interface to guide users through personality setup, automate their knowledge base, and allow full editing of AI content during onboarding. While this offered flexibility and control, it led to longer-than-expected completion times and high drop-off.
2. Express Setup
A 45-second version that scraped website content to auto-generate an AI profile. While it significantly reduced setup time, many users dropped off at the paywall, uncertain about the value it produced or how it reflected their brand.
3. Guided Flow (Final Direction)
We designed a custom, step-by-step path using short, tailored questions. It maintained speed while giving users clarity, light control, and a growing sense of value at each step.
Other Key Focuses
Tone and personality selector to shape AI behavior.
Progress indicator to reduce anxiety around completion.
Branching logic to differentiate experiences for individuals versus agencies.
These concepts were tested using low-fidelity flows with current customers. Feedback quickly shaped the direction, aligning the experience with diverse user types and expectations.
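The branching idea above can be sketched in a few lines. This is a hedged illustration only (the step names and routing are hypothetical, not the production Bubble logic): the user type captured at the first step selects which question sequence follows.

```python
# Hypothetical question sequences per user type.
INDIVIDUAL_STEPS = ["business_basics", "tone_selector", "goal_picker"]
AGENCY_STEPS = ["agency_profile", "client_roster", "tone_selector", "goal_picker"]

def onboarding_path(user_type: str) -> list:
    """Route a user down the question path for their type; default to individual."""
    return AGENCY_STEPS if user_type == "agency" else INDIVIDUAL_STEPS
```

Keeping the branch at a single early decision point meant both paths could share downstream steps (tone, goals) while diverging only where the contexts genuinely differ.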
Examples of the different design concepts
3 Decide
Outcome: The guided, question-based onboarding flow emerged as the clear winner. It outperformed the other versions across key metrics and provided a smoother, more intuitive experience that users consistently described as easy to follow and that left them feeling confident.
Why the Guided Flow Won
Higher completion rates and fewer drop-offs during onboarding
More positive user feedback - during testing, users felt “confident” and “in control”
Reduced confusion - each question focused attention and lowered cognitive load
Improved accuracy - responses were more consistent and meaningful
Better perceived value - users understood what they were building and why it mattered
The step-by-step model helped demystify AI onboarding for users who initially found the idea overwhelming.
How We Prioritized
We used a mix of:
Stakeholder input
User testing feedback
Live A/B test results (Guided vs. Express vs. High User Control)
We also refined scope using feedback from onboarding support sessions and real usage friction.
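As a simple illustration of how the three-way A/B comparison reads, the sketch below computes completion rates per variant. The counts are hypothetical placeholders, not our real test numbers.

```python
# Hypothetical started/completed counts for each onboarding variant.
variants = {
    "high_control": {"started": 400, "completed": 180},
    "express":      {"started": 400, "completed": 230},
    "guided":       {"started": 400, "completed": 310},
}

def completion_rates(data):
    """Completion rate per variant: completed / started."""
    return {name: v["completed"] / v["started"] for name, v in data.items()}

rates = completion_rates(variants)
best = max(rates, key=rates.get)
print(f"Winning variant: {best} ({rates[best]:.1%})")
```

Completion rate was one input among several; qualitative feedback and support friction carried equal weight in the final decision.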
What We Cut or Delayed
Manual knowledge base setup was removed from onboarding. Instead, the AI auto-generates it and users can edit it later.
Redundant questions (like the campaign knowledge base) were consolidated. Asking the same thing twice, even in different contexts, felt repetitive to users.
Typing-heavy inputs were minimized in favor of smart defaults and one-click answers, speeding up the flow without sacrificing data quality.
Guided onboarding user journey map
Express user journey breakdown
4 Prototype & Testing
We explored and validated all three onboarding paths (high user control, express, and guided flow) through low-fidelity sketches, high-fidelity prototypes in Figma, and live builds in Bubble. Testing was conducted in Maze for unmoderated sessions, with real-time feedback captured from live product usage.
Tools Used
Figma and Bubble for design and prototyping
Maze for unmoderated testing
Zoom for internal reviews and alignment
Test Environments
Early concepts were tested as static and clickable flows
Live prototypes were released to users in stages to capture actual behavioral data
Collaboration and Feedback Loops
We collaborated closely with Marketing, Sales, Customer Support, and Engineering to ensure the onboarding experience aligned with messaging, customer expectations, and technical feasibility. Feedback from these teams helped shape how much to frontload configuration and how to ease users into campaign creation.
Iteration Highlights
Chat-style onboarding gave too much flexibility upfront, which led to delays and overthinking
Express setup surfaced value quickly but didn’t build enough trust
Guided onboarding tested highest in clarity, perceived value, and overall completion rates
Optimized for Success
Refined designs built on insights and results.