The Moment Everyone Agrees to Pretend It Makes Sense

As Non-Executive Chair of an online insurance brokerage, I’m currently watching the leadership team conduct yet another round of marketing agency auditions. It’s a familiar pattern I see across my portfolio: impressive presentations, sophisticated attribution models, and metrics that somehow never translate to sustainable customer acquisition.

Watching these presentations reminds me of a recent engagement where I helped a CEO solve exactly this problem: exceptional sales and marketing teams, declining results, and dashboards that insisted everything was fine.

The CEO sensed the real problem but couldn’t pinpoint exactly what it was. More importantly, he knew that confronting the issue directly would create a war between two exceptional teams when what he needed was performance improvement.

The acid test was undeniable: sales were declining despite impressive marketing metrics.

He called me because he needed someone who could navigate the politics, someone commercially savvy enough to see the real problem and emotionally intelligent enough to guide both teams toward a solution they could own. Not to prove a point, but to improve performance.

I didn’t walk in with answers. I walked in with the right questions.

In our pre-workshop conversations, I posed specific questions to both teams about their data and outcomes. Not accusations, but genuine inquiries framed around optimisation: “How do we make our strong performance even stronger?” By the time the workshop arrived, something remarkable had happened.

Without my prompting, both teams had met independently, realised they needed each other’s data to understand the full picture, and arrived with a joint analysis. More than that, the solution had become so obvious during their collaboration that they’d already implemented it. The workshop transformed from a potential blame session into a momentum-building session focused on “how do we keep improving?”

It wasn’t a talking shop where departments defended their positions. It was a genuine workshop where exceptional people worked together to optimise business results.

The workshop worked because the real work had already been done. Most alignment sessions fail because they try to solve relationship problems when the actual problem is information architecture.

What They Discovered Together

The teams’ joint analysis confirmed what I suspected. Despite months of impressive marketing presentations, the business fundamentals told a different story:

  • Only 22% of marketing-generated leads converted to sales meetings
  • Sales could attribute just 15% of closed deals to marketing efforts
  • The cost per qualified lead had increased 60% over six months
  • Marketing spend had doubled, but sales appointments hadn’t moved

When I debriefed with the CEO afterwards, he stared at these numbers like they were written in a foreign language. “But the marketing presentations showed traffic was up 40%,” he said.

Traffic was up. The presentations had been polished. The metrics looked professional. But none of it connected to revenue. The teams had seen the gap between impressive data and business outcomes, and they’d closed it before I’d even arrived.
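You can run the same sanity check on your own numbers without an attribution model; it’s one division. Here’s a minimal sketch in Python, using the 22% lead-to-meeting rate from the list above. The cost per qualified lead below is a hypothetical figure, purely to show the arithmetic:

```python
# Illustrative sanity check: what does a sales meeting actually cost?
# Only the 22% conversion rate comes from the case above; the cost
# per qualified lead is a hypothetical figure for illustration.

cost_per_qualified_lead = 150.00  # hypothetical, in GBP
lead_to_meeting_rate = 0.22       # from the teams' joint analysis

cost_per_sales_meeting = cost_per_qualified_lead / lead_to_meeting_rate
print(f"Cost per sales meeting: £{cost_per_sales_meeting:,.2f}")

# A 60% rise in cost per lead feeds straight through, assuming the
# conversion rate holds:
print(f"After the 60% rise: £{cost_per_sales_meeting * 1.60:,.2f}")
```

The point isn’t the specific figures; it’s that the metric worth tracking is denominated in sales outcomes, and neither team had been computing it.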

The Real Problem: Why Smart People Accept Nonsense

The disconnect isn’t about people; both teams were exceptional. It’s about systems that look sophisticated but deliver nothing. Here’s how it works:

Marketing teams are incentivised to generate activity: clicks, visits, downloads. Sales teams are measured on outcomes: meetings, proposals, closed deals. When these metrics don’t align, the teams can’t align either.

But here’s what most leaders miss: both teams became willing participants in a measurement framework that looked professional but measured nothing that mattered. The marketing team’s framework wasn’t just misaligned; it had been systematically conditioned by the platforms they use. Google’s certification programmes, training academies, and “best practices” have educated an entire generation of marketers to optimise for Google’s business model, not their own company’s outcomes.

The sales team found this convenient. When marketing presented months of improving KPIs with Google’s validation, why question it? When deals didn’t close, they could point to “unqualified leads” rather than examine their own processes. They never demanded revenue attribution. They never defined qualification criteria. They accepted the measurement theatre because it gave them cover.

Both teams found it easier to hide behind impressive metrics than face the brutal reality of their own performance. Marketing could point to traffic growth. Sales could point to “lead quality issues.” Everyone had cover. No one had results.

The Platform Problem

Here’s the uncomfortable truth: UK marketing effectiveness has declined as digital spend has increased. Research from Ebiquity analysing £5 billion worth of UK advertising spend found that digital attribution techniques have “served to inflate the reported ROI from those media lines.” The Institute of Practitioners in Advertising documented “an overall decline in marketing efficiency and ROI in recent years” directly linked to the over-investment in digital performance media.

This isn’t incompetence. It’s conditioning. When Google trains marketers through its certification programmes and academies, it naturally emphasises the metrics that drive Google’s business. Last-click attribution models, conversion tracking focused on platform activity, and optimisation for search volume all benefit Google’s revenue model.

The result? Marketing teams that can recite Google’s attribution models but can’t connect their spend to actual sales outcomes. They’re not being deliberately misled; they’re being educated within a framework designed to optimise for someone else’s business.

Here’s what made this particularly insidious: the data looked impressive. Six months of polished presentations showing traffic up 40%, engagement rates climbing, cost-per-click decreasing. Professional charts, industry-standard terminology, Google’s own benchmarks validating the approach. Why would anyone challenge numbers that looked so convincing?

You’ve been in that meeting. The marketing director starts confidently: “Our multi-touch attribution model shows upper-funnel awareness campaigns driving assisted conversions through cross-device journey mapping…” Thirty seconds in, you’re lost. Sixty seconds in, they’re lost too. Then comes the pivot: nervous laugh, slight panic in their eyes, and the fatal phrase: “Look, the important thing is our ROI is improving. Trust me, the data is solid.”

That smile gives it away. It’s the smile of someone who knows they haven’t actually explained anything but hopes their authority will carry them through. If they can’t explain it simply, they don’t understand it either. They’re just better at hiding confusion behind industry terminology.

But here’s the deeper problem: you nodded anyway. Because questioning it felt like admitting ignorance about something that appeared to be industry standard. Everyone else was nodding. The terminology sounded sophisticated. So you played along with the collective pretence that this sophisticated-sounding nonsense was actually meaningful insight.

It’s the Emperor’s New Clothes, corporate edition: a room full of intelligent people all pretending to understand something that fundamentally doesn’t make sense. The marketing director learned the terminology but not the logic. The sales director accepted it because it sounded professional. The CEO nodded along because challenging it felt like admitting ignorance.

This is why the conditioning is so effective. Everyone becomes complicit in a shared fiction that platform metrics equal business intelligence, and that sophisticated-sounding nonsense is meaningful data.

The Solution They Implemented

The teams’ joint analysis led them to a simple operational framework. Instead of measuring marketing success by platform metrics, they’d measure it by sales outcomes. Every marketing pound would be tied to a sales result.

They’d abandoned the platform-driven measurement framework entirely.

The calculation is straightforward: if you need 100 sales and your demo-to-sale conversion rate is 20%, you need 500 demos. If your website-to-demo conversion rate is 2%, you need 25,000 qualified visitors. Work backwards from revenue, not forwards from activity.
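Here’s a minimal sketch of that backwards calculation in Python; the function name and structure are mine, not the teams’:

```python
def funnel_from_revenue(target_sales: int,
                        demo_to_sale_rate: float,
                        visitor_to_demo_rate: float) -> dict:
    """Work backwards from a sales target to the activity required."""
    demos_needed = target_sales / demo_to_sale_rate
    visitors_needed = demos_needed / visitor_to_demo_rate
    return {
        "sales": target_sales,
        "demos": round(demos_needed),
        "qualified_visitors": round(visitors_needed),
    }

# The worked example from the text: 100 sales at a 20% demo-to-sale
# rate and a 2% visitor-to-demo rate.
print(funnel_from_revenue(100, 0.20, 0.02))
# -> {'sales': 100, 'demos': 500, 'qualified_visitors': 25000}
```

Every input is a business number a CEO can interrogate. None of them is a platform metric.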

This shifted their entire approach. Marketing stopped optimising for Google’s revenue and started optimising for qualified prospects. Sales stopped dismissing marketing efforts and started providing clear qualification criteria. Both teams began working toward the same outcome: business results, not platform metrics.

The energy in the workshop wasn’t defensive; it was ambitious. They weren’t asking “what went wrong?” They were asking “how do we make this even better?”

The Leadership Blind Spot

What struck me most during this engagement wasn’t the misalignment itself; I see that everywhere. It was how completely invisible it had been to the CEO. He’d been reviewing departmental performance in isolation, missing the operational reality that connected them.

But there’s a deeper issue: most CEOs don’t realise their marketing teams have been systematically trained to optimise for someone else’s business model. Google’s educational infrastructure, from certification programmes to best practice guides, has become the de facto curriculum for digital marketing. When you train an entire generation of marketers to think in terms of Google’s success metrics, they naturally optimise for those metrics.

The irony is that real data scientists would spot this problem immediately. They’d ask the brutal questions: “What business outcome are we measuring?” and “How do we know this metric drives revenue?” But many of the new data-driven marketing leaders aren’t actually data people; they’re traditional marketers who’ve learned to speak the language because it’s fashionable. They have just enough analytical vocabulary to sound credible but lack the rigorous thinking to question the frameworks they’re using.

The tragedy is that Google’s measurement theatre has made everyone forget what marketing actually is. Before attribution models and conversion funnels, marketers were skilled at driving sales outcomes through creativity, intuition, and deep customer understanding. The Hovis boy cycling up the hill sold bread for decades without a single click-through metric; make your own mash-up of favourite ads (see what I did there?). Real marketers created demand through storytelling, not traffic optimisation.

This is the leadership blind spot that kills growth. When you measure teams separately using platform-driven frameworks, you get platform-optimised results. When you measure them together using business-driven frameworks, you get business results.

The fix isn’t complicated, but it requires discipline. You have to be willing to abandon metrics that make you feel good in favour of metrics that make you money. You have to be willing to have uncomfortable conversations about what’s really driving results.

The Immediate Action

If you’re a CEO reading this, here’s what you do next. Walk into your next marketing meeting and ask one question: “How many sales appointments did last quarter’s marketing spend generate?”

If they can’t answer immediately, you have the same problem my client had. If they deflect to Google’s traffic numbers or engagement rates, you have the same problem. If they blame sales for not converting the leads, you definitely have the same problem.

The solution isn’t more collaboration. It’s breaking free from platform-driven measurement frameworks. Define what success looks like in revenue terms, then work backwards to activity. Measure both teams against business outcomes, not platform metrics. Make the invisible visible.

Most consultants will tell you this is about people and communication. It’s not. It’s about systems and conditioning. Your marketing team has been trained by companies that profit when you spend more. Their definition of success isn’t your definition of success.

Fix the measurement framework, and the people will follow.

The Bottom Line

Sales and marketing alignment isn’t a nice-to-have. It’s an operational imperative. When these teams work in silos, you’re not just losing efficiency; you’re losing revenue. When they work together toward shared outcomes, you’re not just improving collaboration; you’re improving performance.

The question isn’t whether you can afford to fix this. The question is whether you can afford not to.

If you’re a CEO who suspects your sales and marketing teams aren’t as aligned as they should be, the symptoms are usually obvious once you know what to look for. Most leaders just aren’t looking at the right metrics to see them.

Trevor Parker

Trevor supports businesses and senior leadership teams under pressure, serving as Chair, Non-Executive Director, Interim CEO, or Strategic CEO Advisor, depending on what’s needed. He steps in when performance has slipped, leadership is stretched, or the path forward isn’t clear. He brings stability, restores control, and creates the time and space for management to lead effectively. Whether it’s resetting operations, aligning the team, or getting execution back on track, his focus is on helping the business move forward with clarity and confidence.