How to Build and Validate an MVP in Two Weeks
How we helped InCalendar validate their product idea, get pre-orders, and prepare for fundraising, all before writing a single line of code

The problem InCalendar wanted to solve
InCalendar's mission: help professionals take care of their clients without worrying about appointments.
The idea came from three problems that professionals face daily:
Phone interruptions: Answering calls to schedule appointments breaks concentration and workflow
Last-minute cancellations: No-shows and late cancellations mean lost revenue
Customer marketing: Staying in touch with clients between appointments is hard
Their solution was ambitious: an AI-driven chatbot to manage appointments, with built-in chat, reminders, and a broadcast marketing channel for promotions and upsells.
The concept was big. The risk was equally big. Before investing months of development, InCalendar needed to know: would professionals actually use this?
Why we chose a Design Sprint for validation
A Design Sprint is a structured process for validating product ideas where market risk is high. It compresses months of debate into a focused burst of work: define the problem, prototype a solution, and test it with real users.
For InCalendar, a sprint made sense because:
The product concept was broad and needed focus
The founders had assumptions about user behavior that needed testing
Building the real product would take months and significant investment
Working closely with InCalendar's two founders, we started by framing the idea using a Lean Canvas to map their first business hypothesis: who is the customer, what's the problem, what's the unique value proposition, and what does the revenue model look like.
Then we made a critical decision: instead of trying to validate everything at once, we picked one core flow to test - appointment creation, the fundamental interaction of the entire product.
This is something we see founders get wrong often. They try to validate the whole vision in one go. That's too much. Pick the riskiest assumption, test that first.
What we built: a landing page and a prototype
We approached validation with guerrilla user testing - getting the product in front of real users at the lowest possible cost.
Two tools made this possible:
1. A landing page focused on benefits
The landing page of a SaaS product is part of the product experience. The promises on this page must be kept by the application.
We brainstormed messaging with the founders, then used dot-voting to select the most promising angles. The final page included branding and a logo created specifically for the test, making it feel like a real product launch.
The goal wasn't to generate traffic. It was to test whether the value proposition resonated when put in front of professionals face-to-face.
2. An installable interactive prototype
The second tool was a high-fidelity interactive prototype. We needed the experience to feel real enough that users would respond as if they were using an actual product.
Together with the founders, we defined the boundaries: one task flow (create an appointment for next Tuesday), fully interactive, installable on a smartphone so it looked and behaved like a native app.
No code was written. The entire prototype was built in a design tool.
The interviews: what we tested and what we learned
With the landing page and prototype ready, we ran structured user interviews with 10 professionals. We designed a script in advance and split each interview into two parts:
Part 1: Landing page test
We asked participants to navigate the landing page and tell us what they understood. This let us iterate on the copy in real time - after each interview, we refined the messaging to focus on the benefits that resonated most.
Part 2: Prototype task
Participants were asked to complete one task: add an appointment for the following Tuesday using the conversational interface.
The results
Three signals told us the validation was successful:
100% acceptance of the conversational interface. Every participant understood and could use the chatbot approach to scheduling. This was the riskiest assumption, and it passed.
2 out of 10 participants tried to pre-order. They asked to buy or reserve the product during the interview. This is the strongest validation signal you can get - not "I would use this" but "how do I get this?"
90% tried to explore beyond the prototype. Nine out of ten participants attempted to use features that weren't in the prototype. They tapped on other parts of the interface and asked about additional functionality. This showed genuine engagement, not just polite compliance.
These aren't just nice numbers. In MVP validation, the difference between stated interest ("Yeah, I'd probably use that") and demonstrated behavior (trying to buy, exploring further) is everything. Behavior doesn't lie.
From validation to fundraising
With the validation data in hand, we packaged everything for InCalendar's next step: raising investment.
We delivered:
An interactive research report summarizing all interview findings, patterns, and quotes
A pitch deck translating the validation results into a compelling investor narrative
The founders could now walk into investor meetings with evidence, not just an idea. They had tested their riskiest assumption, proven demand with real users, and had a clear picture of what to build first.
What you can take from this
If you're validating a product idea, here's the approach distilled into steps you can follow:
1. Frame your hypothesis first
Use a Lean Canvas or similar tool to map your assumptions. Be explicit about what you believe and what you don't know. This is the foundation of outcome-focused design - start with what you're trying to learn, not what you're trying to build.
2. Pick the riskiest assumption
Don't try to validate everything. Choose the one thing that, if wrong, makes the entire product irrelevant. For InCalendar, it was: "Will professionals accept a chatbot interface for appointment management?"
3. Build the cheapest possible test
A landing page and a prototype cost a fraction of actual development. You need just enough fidelity for users to respond authentically. No code required.
4. Define success criteria before testing
Before you run a single interview, decide what "validated" looks like. Is it 3 out of 10 wanting to buy? Is it zero confusion on the core flow? Set the bar first. This prevents you from rationalizing weak results after the fact.
5. Test with real users, not friends
Guerrilla testing with your actual target audience gives you honest reactions. Friends and colleagues will be too kind. You need people who will tell you the truth.
6. Let behavior speak louder than words
Track what people do, not just what they say. Pre-orders, prototype exploration, and task completion rates are stronger signals than enthusiasm in an interview.
Ready to validate your idea?
InCalendar went from an ambitious concept to validated demand in two weeks, without writing a line of code. That's the power of combining a Design Sprint with targeted user research.
If you're building a SaaS product and need to validate before you invest in development, that's exactly what our product design team does. We've helped 80+ SaaS companies figure out what to build and what to skip.
Whether you're launching a new product or running a design sprint to validate your next move, we can help you get answers fast.
Related reading
5 Things Every Founder Should Know About UX - the UX essentials every startup founder needs before building
Principles for Startup Success - the co-design principles that drive successful product teams
What Is Product Design? How to Create User-Centric Products - understanding the full scope of product design beyond UI
Is the SaaS Model Right for Your Product? - evaluate whether SaaS fits your business before committing
A Practical Guide to the Double Diamond Design Process - the discovery and delivery framework behind structured design sprints