When the Data Said Our Facebook Campaigns Were Lying to Us

The numbers in Facebook Ads Manager didn't match the numbers in Matomo. When I worked out why, the conversation that followed was not one the social media specialist particularly enjoyed.

This is a story about what happens when you build proper cross-channel tracking — and what to do when the data stops agreeing with itself.

The problem with trusting any single source

Our Facebook campaigns appeared, on paper, to be performing well. Reasonable click-through rates, a cost-per-acquisition we were comfortable with, and a reported return on ad spend that justified the monthly budget. The social media specialist I worked with had every right to be pleased with the numbers.

The issue was attribution.

Facebook’s Ads Manager — like most paid channel reporting — uses last-click attribution as its default. Any conversion where Facebook was the last touchpoint gets full credit. What it doesn’t show you is that many of those users had already been acquired through another channel. They’d come through organic search first, or opened an email, clicked through, left, and only returned via a retargeted Facebook ad. Facebook counted the conversion. It counted it as if it had done all the work.
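To make the difference concrete, here is a toy model of the two attribution rules. The journey data is invented for illustration; real platforms use configurable windows and more nuance than this, but the credit-assignment logic is the point:

```python
def last_click_credit(journey):
    # Last-click: 100% of the conversion credit goes to the final touchpoint.
    return {journey[-1]: 1.0}

def first_touch_credit(journey):
    # First-touch: 100% of the credit goes to the channel that acquired the user.
    return {journey[0]: 1.0}

# A journey like the one described above: acquired organically,
# nudged by email, converted after a retargeted Facebook ad.
journey = ["organic_search", "email", "facebook_retargeting"]

print(last_click_credit(journey))   # Ads Manager's view: Facebook did all the work
print(first_touch_credit(journey))  # The journey's view: organic search acquired the user
```

Neither rule is "correct" — they answer different questions, which is exactly why the two tools disagreed.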

What Matomo showed instead

Part of my job was to provide the social media specialist with a Looker Studio dashboard pulling data from Matomo alongside Facebook campaign metrics — reach, impressions, CTR, conversion rate, CPA — all the standard metrics needed to optimise the campaigns.

But Matomo gives you the full visitor journey. I could see the actual sequence of channels before a conversion. When I started cross-referencing the users Facebook was claiming credit for, a pattern emerged: a significant portion had first come to the site through organic search or email. Facebook retargeting had reached them later. In those cases, the conversion would very likely have happened without the ad spend.

I built a segment in Matomo specifically to isolate new users acquired through Facebook — visitors for whom Facebook was genuinely the first touchpoint, not just the last one. The conversion rate for that group was notably lower than the blended number Facebook Ads Manager was reporting.
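The same comparison can be done outside Matomo's UI if you can export the visits log. This is a sketch, not Matomo's actual API — the tuple layout of the visit records is an assumption I've made for illustration:

```python
from collections import defaultdict

def first_touch_conversion_rate(visits, channel):
    """Conversion rate among visitors whose FIRST-ever visit came via `channel`.

    visits: iterable of (visitor_id, timestamp, channel, converted) tuples —
    a hypothetical flat export of the visits log, not a real Matomo schema.
    """
    journeys = defaultdict(list)
    for visitor_id, ts, ch, converted in visits:
        journeys[visitor_id].append((ts, ch, converted))

    acquired, converted_count = 0, 0
    for events in journeys.values():
        events.sort()  # order each visitor's visits by timestamp
        if events[0][1] != channel:
            continue  # the channel was not the true first touchpoint
        acquired += 1
        if any(conv for _, _, conv in events):
            converted_count += 1
    return converted_count / acquired if acquired else 0.0

visits = [
    ("v1", 1, "organic_search", False),
    ("v1", 2, "facebook", True),   # last-click would credit Facebook here
    ("v2", 1, "facebook", False),
    ("v2", 2, "facebook", True),   # genuinely Facebook-acquired
    ("v3", 1, "facebook", False),  # Facebook-acquired, never converted
]
print(first_touch_conversion_rate(visits, "facebook"))
```

Note how visitor v1 converts after a Facebook touch but is excluded from the first-touch rate — that exclusion is the entire gap between the blended number and the segment.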

The conversation

Showing this analysis to the social media specialist was not straightforward. The campaigns had been performing ‘well’ for months. Questioning that, even with data, can feel like an attack on someone’s work.

What helped was framing the problem as a tooling issue, not a performance issue. Facebook’s attribution is a feature of how the platform works, not evidence of anything wrong with the campaigns. The question wasn’t whether the campaigns were good — it was whether we were measuring the right thing. When I put it that way, the conversation shifted from defensive to collaborative.

We agreed to run a structured test: pause retargeting to a subset of the audience for a period, hold everything else constant, and compare conversion volume. The holdout approach is the only honest way to measure incrementality — it gave us data we could act on rather than argue about.
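When the holdout results come back, a simple two-proportion z-test tells you whether the difference between the retargeted group and the held-out group is more than noise. The conversion counts below are hypothetical, purely to show the shape of the calculation:

```python
import math

def two_proportion_z(conversions_a, n_a, conversions_b, n_b):
    """z-statistic for the difference between two conversion rates,
    using the pooled standard error. |z| > 1.96 is roughly significant
    at the 5% level (two-sided)."""
    p_a = conversions_a / n_a
    p_b = conversions_b / n_b
    pooled = (conversions_a + conversions_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical numbers: retargeted audience vs. held-out audience.
z = two_proportion_z(380, 10_000, 360, 10_000)
print(round(z, 2))  # well below 1.96: no evidence the retargeting added conversions
```

A z-statistic that small is exactly the "very little incremental value" outcome described below — the retargeted group converted at essentially the same rate as the group that saw no ads.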

What changed

The test confirmed what the Matomo analysis had suggested. Retargeting campaigns targeting users who had already converted or visited the site multiple times were delivering very little incremental value. Most of those users were going to convert anyway.

We reallocated part of that budget toward upper-funnel activity — reaching genuinely new audiences rather than following existing ones around the internet. The specialist was able to show the commercial team a more defensible set of numbers, and I got a dashboard I actually trusted.

The broader lesson: any channel that controls its own reporting will, by default, tell you a story that favours itself. Last-click attribution doesn’t make Facebook wrong — it makes it incomplete. The analyst’s job isn’t to take that number at face value. It’s to build the view across channels and ask the uncomfortable questions when the numbers don’t add up.

What to check if you find yourself in the same position

  • Check your attribution windows. Facebook’s default may not match what your analytics platform is measuring. Compare like with like before drawing any conclusions.
  • Segment new vs. returning users. In Matomo or GA4, build a segment for users whose first-ever session came from the paid social channel. That’s your baseline for true acquisition performance.
  • Look at the full journey. Matomo’s multi-channel attribution reports show you where a channel sits in the conversion sequence, not just whether it appears at the end.
  • Run a holdout test. Pause retargeting to a subset of your audience for two to four weeks and compare conversion rates. It’s the closest thing to a controlled experiment you’ll get in paid social.

The data didn’t say the campaigns were failing. It said we were measuring them incorrectly. That’s a much easier problem to fix — if you’re willing to look for it.
