How to Build and Validate an MVP in Two Weeks

How we helped InCalendar validate their product idea, get pre-orders, and prepare for fundraising, all before writing a single line of code

Giustino Borzacchiello, Mar 20, 2026

TL;DR

InCalendar validated their AI appointment chatbot in two weeks using a Design Sprint, a landing page, and guerrilla user testing with 10 professionals. Two participants tried to pre-order during testing, and 100% accepted the conversational interface. They secured investment-ready validation without writing a single line of code.

Around 90% of startups fail. The top reasons? Running out of money, targeting the wrong market, and skipping product research. InCalendar, a Milanese startup, didn't want to become that statistic. They came to us with an ambitious idea and a simple question: is this worth building? Two weeks later, they had their answer, backed by real user data, two pre-orders, and a pitch deck ready for investors. Here's how we did it.

The problem InCalendar wanted to solve

InCalendar's mission: help professionals take care of their clients without worrying about appointments.

The idea came from three problems that professionals face daily:

  • Phone interruptions: Answering calls to schedule appointments breaks concentration and workflow

  • Last-minute cancellations: No-shows and late cancellations are wasted money

  • Customer marketing: Staying in touch with clients between appointments is hard

Their solution was ambitious: an AI-driven chatbot to manage appointments, with built-in chat, reminders, and a broadcast marketing channel for promotions and upsells.

The concept was big. The risk was equally big. Before investing months of development, InCalendar needed to know: would professionals actually use this?


Why we chose a Design Sprint for validation

A Design Sprint is a structured process for validating product ideas where market risk is high. It compresses months of debate into a focused burst of work: define the problem, prototype a solution, and test it with real users.

For InCalendar, a sprint made sense because:

  • The product concept was broad and needed focus

  • The founders had assumptions about user behavior that needed testing

  • Building the real product would take months and significant investment

Working closely with InCalendar's two founders, we started by framing the idea using a Lean Canvas to map their first business hypothesis: who is the customer, what's the problem, what's the unique value proposition, and what does the revenue model look like.

Then we made a critical decision: instead of trying to validate everything at once, we picked one core flow to test - appointment creation, the fundamental interaction of the entire product.

This is something we often see founders get wrong. They try to validate the whole vision in one go. That's too much. Pick the riskiest assumption, test that first.


What we built: a landing page and a prototype

We approached validation with guerrilla user testing - getting the product in front of real users at the lowest possible cost.

Two tools made this possible:


1. A landing page focused on benefits

The landing page of a SaaS product is part of the product experience. The promises on this page must be kept by the application.

We brainstormed messaging with the founders, then used dot-voting to select the most promising angles. The final page included branding and a logo created specifically for the test, making it feel like a real product launch.

The goal wasn't to generate traffic. It was to test whether the value proposition resonated when put in front of professionals face-to-face.


2. An installable interactive prototype

The second tool was a high-fidelity interactive prototype. We needed the experience to feel real enough that users would respond as if they were using an actual product.

Together with the founders, we defined the boundaries: one task flow (create an appointment for next Tuesday), fully interactive, installable on a smartphone so it looked and behaved like a native app.

No code was written. The entire prototype was built in a design tool.


The interviews: what we tested and what we learned

With the landing page and prototype ready, we ran structured user interviews with 10 professionals. We designed a script in advance and split each interview into two parts:

Part 1: Landing page test

We asked participants to navigate the landing page and tell us what they understood. This let us iterate on the copy in real time - after each interview, we refined the messaging to focus on the benefits that resonated most.

Part 2: Prototype task

Participants were asked to complete one task: add an appointment for the following Tuesday using the conversational interface.


The results

Three signals told us the validation was successful:

  1. 100% acceptance of the conversational interface. Every participant understood and could use the chatbot approach to scheduling. This was the riskiest assumption, and it passed.

  2. 2 out of 10 participants tried to pre-order. They asked to buy or reserve the product during the interview. This is the strongest validation signal you can get - not "I would use this" but "how do I get this?"

  3. 90% tried to explore beyond the prototype. Nine out of ten participants attempted to use features that weren't in the prototype: they tapped on other parts of the interface and asked about additional functionality. This showed genuine engagement, not just polite compliance.

These aren't just nice numbers. In MVP validation, the difference between stated interest ("Yeah, I'd probably use that") and demonstrated behavior (trying to buy, exploring further) is everything. Behavior doesn't lie.


From validation to fundraising

With the validation data in hand, we packaged everything for InCalendar's next step: raising investment.

We delivered:

  • An interactive research report summarizing all interview findings, patterns, and quotes

  • A pitch deck translating the validation results into a compelling investor narrative

The founders could now walk into investor meetings with evidence, not just an idea. They had tested their riskiest assumption, proven demand with real users, and had a clear picture of what to build first.


What you can take from this

If you're validating a product idea, here's the approach distilled into steps you can follow:


1. Frame your hypothesis first

Use a Lean Canvas or similar tool to map your assumptions. Be explicit about what you believe and what you don't know. This is the foundation of outcome-focused design - start with what you're trying to learn, not what you're trying to build.


2. Pick the riskiest assumption

Don't try to validate everything. Choose the one thing that, if wrong, makes the entire product irrelevant. For InCalendar, it was: "Will professionals accept a chatbot interface for appointment management?"


3. Build the cheapest possible test

A landing page and a prototype cost a fraction of actual development. You need just enough fidelity for users to respond authentically. No code required.


4. Define success criteria before testing

Before you run a single interview, decide what "validated" looks like. Is it 3 out of 10 wanting to buy? Is it zero confusion on the core flow? Set the bar first. This prevents you from rationalizing weak results after the fact.


5. Test with real users, not friends

Guerrilla testing with your actual target audience gives you honest reactions. Friends and colleagues will be too kind. You need people who will tell you the truth.


6. Let behavior speak louder than words

Track what people do, not just what they say. Pre-orders, prototype exploration, and task completion rates are stronger signals than enthusiasm in an interview.


Ready to validate your idea?

InCalendar went from an ambitious concept to validated demand in two weeks, without writing a line of code. That's the power of combining a Design Sprint with targeted user research.

If you're building a SaaS product and need to validate before you invest in development, that's exactly what our product design team does. We've helped 80+ SaaS companies figure out what to build and what to skip.

Whether you're launching a new product or running a design sprint to validate your next move, we can help you get answers fast.

Book a discovery call



Got questions?

How long does a Design Sprint take?

A standard Design Sprint runs five days, but some teams compress it into three or four. The key is having dedicated time with the right people in the room, not stretching it across weeks of part-time work.


Do I need a working product to validate my idea?

No. A high-fidelity prototype built in a design tool can feel real enough to test with users. The goal is to simulate the core experience, not build the actual technology behind it.


How many users should I test with?

Research shows that 5 users reveal about 85% of usability problems. For broader validation like InCalendar's, 8-10 users give you enough signal to spot patterns and make confident decisions.
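The 85% figure comes from Nielsen and Landauer's problem-discovery model, P = 1 − (1 − L)^n, where L is the probability that a single user uncovers a given problem (commonly estimated at around 0.31). A quick sketch of how the numbers fall out:

```python
# Nielsen & Landauer's usability-problem discovery model:
# P(found) = 1 - (1 - L)^n, with L ~= 0.31 as the commonly
# cited per-user discovery rate (an assumption, not a constant).

def problems_found(n_users: int, discovery_rate: float = 0.31) -> float:
    """Expected fraction of usability problems found by n_users testers."""
    return 1 - (1 - discovery_rate) ** n_users

for n in (1, 3, 5, 10):
    print(f"{n:2d} users -> {problems_found(n):.0%} of problems found")
```

With the default rate, five users land at roughly 84-85% of problems found, and returns diminish quickly after that, which is why 8-10 users is plenty for spotting patterns.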


What if my validation results are mixed?

Mixed results are still valuable. They tell you which parts of your concept work and which need rethinking. The worst outcome isn't a mixed signal, it's building for months without any signal at all.


How much does MVP validation cost compared to building the full product?

A Design Sprint with prototyping and user testing typically costs $5K-$15K. Compare that to months of development that can easily run $50K-$200K. Validation is a fraction of the cost and prevents you from building something nobody wants.


Can I run a Design Sprint without a design agency?

Yes, if you have someone on your team with facilitation experience and access to a prototyping tool. The structured process is what matters. An agency brings speed and experience, but the framework itself is open and well-documented.


We’ll help you build the right product, faster

The first step is a quick chat

Donux srl © 2026 Via Carlo Farini 5, 20154 Milano P.IVA IT11315200961
