Your strategy isn’t broken, your validation is
Most marketing strategies fail not because they're poorly built, but because they're validated by people who already agree with you.
I saw a post from my colleagues at Nutcracker Agency a while back that’s stuck with me. They referenced Chris Voss, the former FBI lead negotiator, who said the turning point in any persuasive interaction is when your partner says, “That’s right.” Not “you’re right,” which is just polite agreement. “That’s right” means they feel heard, understood, seen.
Then they asked a question that probably made a few marketing heads squirm: “Does your social media strategy resonate more with you when it should speak directly to your audience?”
That question applies to everything. Your campaigns. Your positioning. Your entire strategic framework.
Most strategies don’t fail because they’re poorly constructed. They fail because they’re built on internal validation instead of customer understanding. You’re optimising for applause from the boardroom when you should be earning recognition from buyers.
You’re validating with people who already agree with you
Here’s what happens. You spend weeks building a strategy. Research, competitor analysis, SWOT matrices, the works. You present it to leadership. They love it. The deck looks sharp. The logic holds. Everyone nods.
So you execute. And three months later, you’re staring at mediocre results wondering what went wrong.
What went wrong is you validated your strategy with people who already agree with you. Your team, your executives, your internal stakeholders. They all speak the same language, share the same context, operate under the same assumptions. When they say your strategy makes sense, they mean it makes sense to them.
Your prospects don’t care what makes sense to you.
Consider what we pulled off with Printt, an online printing company getting squeezed by competitors targeting their branded search terms. Printt’s original approach was throwing money at paid traffic, watching growth stall, and hoping something would shift.
We could’ve built a perfectly logical strategy around “increasing market share through enhanced digital presence.” Sounds great in a meeting. Means nothing to someone searching for printing services at 11pm who just needs their business cards by Friday.
Instead, we cut Printt’s Google Ads spend by £93k whilst increasing ROI by 47%. We hit 200% monthly ROAS and drove 19,200 confirmed orders. Not because we had a prettier strategy document. Because we stopped validating against internal metrics and started validating against buyer behaviour.
“Looking good” has replaced “working well”
I’ve watched teams defend strategies that aren’t working because, on paper, they still look brilliant. The targeting is sophisticated. The messaging framework is tight. The attribution model is comprehensive.
All true. Also irrelevant if prospects aren’t responding.
There’s a case from the gaming world that illustrates this perfectly. Playstack published Balatro, a poker-themed roguelike that’s now sold over 5 million copies. Early on, they tried the conventional approach: gameplay trailers, voiceovers, flashy cuts highlighting the joker mechanics.
All the trailers looked professional. They’d work great in a pitch to investors. They bombed with actual players because Balatro’s magic isn’t in watching UI elements. It’s in experiencing the rush of a perfect run.
Playstack scrapped their polished trailers and pivoted to streamer content, hosting a tournament with six creators before launch. They failed fast, validated differently, and built a campaign around genuine player reactions instead of what “should” work for a game launch.
Sometimes your strategy is beautifully constructed for the wrong question.
Your sales team holds the answers you’re ignoring
Nutcracker was founded on a simple belief: sales and marketing should be aligned for effective commercial results. Sounds obvious. Rarely happens.
Your sales team talks to prospects every single day. They hear the objections, the questions, the hesitations. They know which features prospects care about and which ones get blank stares. They understand the actual buying journey, not the theoretical one in your customer journey map.
When your strategy gets validated by everyone except the people who live in the trenches with your buyers, you’ve validated the wrong thing.
This isn’t about running every strategic decision through sales. It’s about recognising that if your sales team is confused by your positioning, your prospects definitely are. If they can’t explain your value prop in a sentence, neither can your market.
Strategy built in isolation from sales isn’t customer-centric. It’s internally optimised performance art.
Real validation forces you to question what you’d rather keep as given
Real validation is uncomfortable. It forces you to test assumptions you’d rather keep as givens.
Take your targeting. You’ve identified your ICP based on firmographics, industry verticals, company size. Makes sense internally. But when did you last validate that these companies actually have the problem you solve, right now, urgently enough to take action?
Or your messaging. Your positioning framework passed the executive review. Your brand team loves it. Crucial question: when prospects read it, do they mentally nod and think “that’s right,” or do they scroll past because it sounds like every other vendor in your space?
Your content strategy might have editorial calendars, SEO optimisation, distribution plans. But if you’re creating content that should perform well according to search volume instead of content that matches actual buyer intent, you’re validating against Google’s data instead of your customers’ needs.
The Printt case demonstrates this. Online printing is commoditised. High search volumes for generic terms. Easy to build a strategy around chasing those keywords. We validated differently. We identified what drove actual orders, cut wasteful spend, and focused on intent that converted.
That validation came from performance data, not strategy documents.
Calling it “agile” doesn’t excuse the lack of conviction
Teams love calling their strategy “agile” to avoid admitting they’re just reacting without conviction. Real agility isn’t changing direction every quarter because your current approach isn’t working. It’s building in validation mechanisms from the start so you can pivot based on evidence, not panic.
Our approach at Nutcracker emphasises this: your strategy should be “agile, adaptable, and guided by data.” Not rigid and unchangeable, but also not rudderless. You need the discipline to test, measure, and adjust based on what you learn, not what you hoped would be true.
The difference? A validated strategy that needs adjustment is refined. An invalidated strategy that needs overhaul is replaced.
Three questions that reveal if your validation is broken
Look at your current strategy. Now answer these honestly:
When was the last time a prospect said something that made you question a core assumption in your approach? If the answer is “never” or “I can’t remember,” you’re not validating with the right people.
Can your sales team clearly articulate your strategy to a prospect in a way that earns “that’s right” instead of polite confusion? If not, your validation loop is broken.
Are you measuring success by metrics that directly correlate with revenue, or proxies that make your reports look good? Impressions, engagement, content downloads... these matter only if they lead somewhere commercial.
Your strategy might be perfectly logical. But logic validated by people who already agree with you isn’t customer validation. It’s confirmation bias with better formatting.
Ready to find out where your validation is actually breaking down?
I’ve built a diagnostic that evaluates your strategy validation process and shows you exactly what needs fixing. It takes about four minutes (under five if you enjoy the spiel).
Then come back and tell me: what surprised you most about your results?