What to do when your test flops

Dan Dovaston
Head of Delivery

So you ran the test. You put your idea in front of real people - maybe a landing page, maybe a prototype, maybe a set of conversations with the exact audience you were building for - and the results came back… flat. Or worse, actively discouraging. Nobody signed up. Nobody clicked. The people who did click didn't do the thing you needed them to do.

Right now you're probably somewhere between I knew this was too good to be true and maybe the test was wrong. Both feelings are completely normal, and neither is particularly useful on its own. So let's talk about what actually just happened, and what you should do next.

A failed test isn't a failed idea

This is the bit most people get wrong, and it's worth being precise about. When a validation test comes back negative, it hasn't invalidated your idea. It's invalidated an assumption - one specific thing you believed to be true about your market, your audience, your pricing, your messaging, or the problem you're solving. That's a much smaller thing than "the whole idea is dead", even though it doesn't feel smaller right now.

But here's the honest flip side: sometimes the assumption that just got knocked down is load-bearing. If your test was "do people actually have this problem?" and the answer came back no, that's a different conversation to "do people prefer blue buttons or green buttons?". The size of the assumption matters.

A test that tells you the idea doesn't work has done its job. The failure is skipping the test entirely.

And that's genuinely worth sitting with. You are now in a better position than the version of you who hadn't tested anything at all - the one who'd have spent six months and a significant chunk of savings building something nobody wanted. That version of you would love to trade places.

Before you do anything, diagnose what actually broke

The temptation after a bad result is to either bin the idea completely or pivot immediately to something new. Both are emotional responses dressed up as strategic ones. Before you move, you need to figure out what the test actually told you - which means being honest about what you were testing and how clean the test was.

Ask yourself three things:

  1. Was the test designed well? Did you actually isolate the thing you were trying to learn, or were there three or four variables tangled up together? If your landing page flopped, was it the value proposition that didn't land, or was it that you drove the wrong traffic to the page? A poorly designed test gives you noise, not signal. That's not the idea's fault - it's the test's fault, and it means you need to run a better one.
  2. What specific assumption got hit? Get granular. "People didn't want it" isn't useful. "Small business owners in our target segment didn't see enough value in the time-saving aspect to give us their email" - that's something you can work with. The more precisely you can name what didn't hold up, the more options you have for what to try next.
  3. Is this a surface problem or a structural one? There's a real difference between "the way we described this didn't resonate" and "the problem we're solving isn't painful enough for anyone to pay for a solution." The first is a messaging issue. The second is a market issue. One of them is fixable in an afternoon. The other might not be fixable at all.

Pivot, iterate, or stop?

Once you've done the diagnosis, you're typically looking at one of three paths - and the right one depends entirely on what broke.

Iterate when the core problem is real but your approach to it was off. Maybe the audience is right but the solution shape is wrong. Maybe the solution is right but you were talking to the wrong segment. Iteration means keeping the foundation and changing something specific - your pricing model, your target user, the feature you led with, or how you framed the whole thing. Most tests that "fail" actually land here.

Pivot when the test reveals a different, more interesting problem nearby. This happens more than you'd think. You set out to solve Problem A, and during your testing you kept hearing people talk about Problem B - something adjacent that clearly kept them up at night. A pivot isn't failure, it's following the signal to where the energy actually is.

Stop when the evidence is genuinely telling you that the problem isn't painful enough, the market is too small, or the economics simply don't work. And look - this is the hard one. Nobody wants to hear it, and we're not going to dress it up with false optimism. Sometimes the kindest, smartest thing your test data can do is save you from spending another year on something that won't get there. That's not a defeat. That's the whole point of testing early.

The worst outcome isn't a test that says stop. It's never testing at all and finding out two years and £100k later.

What to do right now

If you're sitting with a result that feels bad, here's the practical version. Write down exactly what you tested and exactly what happened - not how it made you feel, but what the numbers or feedback actually said. Separate the assumption from the idea. Talk to someone who wasn't emotionally involved in the test and get their read on the data. And resist the urge to make a big decision today.

The best founders we work with at Rise aren't the ones whose first test always works - those people are mostly just lucky, or they're not being honest about their numbers. The best founders are the ones who treat a bad result as information, figure out what it means, and decide what to do next with a clear head.

If you're not sure what your test results are really telling you, that's a good reason to talk it through with someone who reads this kind of data every week. We offer a free 30-minute discovery call with one of Rise's founders - no obligation, no pitch, just an honest look at what you've got and where it points. Book a call and bring your results. We'll help you figure out what actually happened.
