Analytics & CRO
Technical SEO became my problem the day our Core Web Vitals scores tanked after a Next.js migration. The sprint was closed, the engineering team had moved on, and Google Search Console was showing a sharp increase in “poor” URLs across the site. Nobody had checked LCP before the release went out.
That’s the pattern with technical SEO: it doesn’t stay in the SEO team’s lane. Every deployment is a technical SEO event. Every redirect, every change to URL structure, every page added to or removed from the sitemap affects how Google understands and ranks the site. As a Product Owner, you don’t need to be the person who implements the fixes. But you do need to understand what the data is telling you, so you can prioritise the right things at the right time.
What technical SEO actually covers
Technical SEO is the set of site health factors that affect how search engines crawl, index, and rank your pages — independent of the content on those pages. It covers site speed and Core Web Vitals, mobile usability, crawl budget, URL structure, canonical tags, redirect chains, structured data, and XML sitemaps.
The reason analytics matters here is prioritisation. A GSC Coverage report might surface 40 issues. Not all of them cost you rankings equally. Analytics data — session volume by page, organic traffic trends, impression counts from GSC — tells you which issues are connected to your highest-value pages, and which are technical debt that will never affect real traffic.
Core Web Vitals: where analytics and technical SEO meet
Google’s Core Web Vitals — LCP (Largest Contentful Paint), CLS (Cumulative Layout Shift), and INP (Interaction to Next Paint) — are both a ranking factor and a user experience signal. The analytics angle is important: poor scores don’t affect all pages equally, and they don’t affect all users equally.
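If you want that per-page, per-user view outside of GSC's 28-day aggregates, field measurement is cheap to add. Here's a minimal sketch using the open-source web-vitals library; the collection endpoint and the payload shape are my own placeholders, not a standard.

```ts
// Minimal field measurement of the three Core Web Vitals using the
// web-vitals library. The /analytics/vitals endpoint and payload shape
// are illustrative placeholders, not part of any API.
import { onLCP, onCLS, onINP, type Metric } from 'web-vitals';

function reportToAnalytics(metric: Metric): void {
  // metric.rating is 'good' | 'needs-improvement' | 'poor', which maps
  // onto the buckets GSC uses in its Core Web Vitals report.
  navigator.sendBeacon(
    '/analytics/vitals', // hypothetical collection endpoint
    JSON.stringify({
      name: metric.name,   // 'LCP' | 'CLS' | 'INP'
      value: metric.value, // ms for LCP and INP, unitless for CLS
      rating: metric.rating,
      page: location.pathname,
    })
  );
}

onLCP(reportToAnalytics);
onCLS(reportToAnalytics);
onINP(reportToAnalytics);
```

Segmenting those beacons by URL path or template is what lets you say "poor LCP is concentrated on the search results pages" rather than "the site is slow".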
After our Next.js migration, GSC showed “poor” LCP scores concentrated on our search results pages — the highest-traffic, most commercially significant pages on the site. That told me exactly where to focus engineering time. We didn’t need to fix every page. We needed to fix those pages first, because that’s where the ranking impact and the user cost were largest.
This is the pattern that repeats: technical SEO tells you what’s broken. Analytics tells you whether it matters enough to act on now.
Crawl budget and redirect chains
Crawl budget is how many pages Googlebot will crawl on your site in a given period. For large sites, it’s a real constraint. If Googlebot is spending time on redirect chains, paginated archives, or pages with no SEO value, it’s time not spent on content that matters.
Analytics helps in two ways. First, session data shows you which pages generate meaningful traffic — and by extension, which pages are worth Googlebot’s time. Second, GSC crawl reports show which URLs are being crawled most frequently, which surfaces chains and dead ends you might not have known about.
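GSC's crawl reports give you part of that picture; your own server logs can fill in the rest. As a rough sketch, assuming Node 18+ and a standard combined-format access log (and noting that a real version should verify Googlebot by reverse DNS rather than trusting the user-agent string), counting Googlebot hits per URL looks something like this:

```ts
// Rough sketch: count requests per URL from hits whose user agent claims
// to be Googlebot. The log path is an assumption; production use should
// verify Googlebot via reverse DNS, not the user-agent string.
import { createReadStream } from 'node:fs';
import { createInterface } from 'node:readline';

async function googlebotHitsByPath(logPath: string): Promise<Map<string, number>> {
  const counts = new Map<string, number>();
  const rl = createInterface({ input: createReadStream(logPath) });

  for await (const line of rl) {
    if (!line.includes('Googlebot')) continue;
    // Combined log format: ... "GET /some/path HTTP/1.1" ...
    const match = line.match(/"(?:GET|HEAD) (\S+) HTTP/);
    if (!match) continue;
    counts.set(match[1], (counts.get(match[1]) ?? 0) + 1);
  }
  return counts;
}

googlebotHitsByPath('./access.log').then((counts) => {
  const top = [...counts.entries()].sort((a, b) => b[1] - a[1]).slice(0, 20);
  console.table(top); // the 20 URLs Googlebot spends the most requests on
});
```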
The redirect work I’ve done on this blog — collapsing multi-hop chains, removing obsolete category URLs, updating internal links that point to redirected destinations — was driven by exactly this combination. GSC coverage reports showed the problem. Matomo traffic data told me which chains were connected to pages with real impressions, and therefore which ones to fix first.
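Finding the chains themselves doesn't need a crawler licence either. A minimal sketch, assuming Node 18 or later for the built-in fetch, is to follow Location headers one hop at a time:

```ts
// Minimal redirect-chain tracer, assuming Node 18+ for built-in fetch.
// Follows Location headers one hop at a time and returns the full chain,
// so multi-hop chains (and loops) show up explicitly.
async function traceRedirects(startUrl: string, maxHops = 10): Promise<string[]> {
  const chain = [startUrl];
  let current = startUrl;

  for (let hop = 0; hop < maxHops; hop++) {
    const res = await fetch(current, { method: 'HEAD', redirect: 'manual' });
    if (res.status < 300 || res.status >= 400) break; // final destination reached

    const location = res.headers.get('location');
    if (!location) break;

    current = new URL(location, current).toString(); // resolve relative redirects
    if (chain.includes(current)) { chain.push(current); break; } // loop detected
    chain.push(current);
  }
  return chain;
}

// Anything with more than two entries is a multi-hop chain worth collapsing.
traceRedirects('https://example.com/old-category/post').then(console.log);
```

Anything that comes back longer than two entries is a multi-hop chain: point the internal link, or the redirect rule itself, straight at the final URL.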
Internal linking as a deliberate product decision
Internal links distribute PageRank across your site. Pages with more internal links pointing at them tend to rank better, all else being equal. Most teams don’t make this decision deliberately — internal links accumulate based on what content existed at the time of writing, not based on what you want to rank.
Analytics makes this less arbitrary. If you know which posts have high impression volume but are stuck at position 11–20, those are the pages that benefit most from additional internal link equity. You find them in GSC, audit which existing posts could reasonably link to them, and update those posts. It’s a low-effort intervention with a measurable effect — and it’s one of the few technical SEO levers a non-engineer can pull directly.
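You can do this by hand in the GSC interface, or pull the same list programmatically through the Search Console API. A sketch using the googleapis Node client, assuming a service account that has been added to the property with read access; the property URL, date range, and impression threshold are placeholders:

```ts
// Sketch: find "striking distance" pages (positions 11-20, meaningful
// impressions) via the Search Console API. Assumes a service account with
// read access to the property; dates and thresholds are placeholders.
import { google } from 'googleapis';

async function strikingDistancePages(siteUrl: string) {
  const auth = new google.auth.GoogleAuth({
    scopes: ['https://www.googleapis.com/auth/webmasters.readonly'],
  });
  const searchconsole = google.searchconsole({ version: 'v1', auth });

  const { data } = await searchconsole.searchanalytics.query({
    siteUrl,
    requestBody: {
      startDate: '2024-01-01',
      endDate: '2024-03-31',
      dimensions: ['page'],
      rowLimit: 5000,
    },
  });

  // Pages ranking just off page one with real impression volume are the
  // ones that benefit most from extra internal link equity.
  return (data.rows ?? [])
    .filter((r) => (r.position ?? 0) >= 11 && (r.position ?? 0) <= 20 && (r.impressions ?? 0) > 100)
    .sort((a, b) => (b.impressions ?? 0) - (a.impressions ?? 0));
}

strikingDistancePages('https://www.example.com/').then((rows) => console.table(rows));
```

The output is a ranked list of internal-link targets: high impressions, just off page one.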
The practical starting point
If you haven’t looked at technical SEO through an analytics lens before, the entry point is simple: read GSC’s Coverage report and Core Web Vitals report alongside your traffic data by page. That combination tells you which pages Google is struggling with, which of those pages actually drive traffic, and therefore which technical issues to address first.
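The join itself is trivial once you have the exports. A sketch, assuming two CSV exports whose file names and column order are my own invention (a URL column in both, a pageview count in the analytics one):

```ts
// Sketch of the prioritisation join: rank the URLs Google flags as "poor"
// by how much traffic they actually carry. File names and column positions
// are assumptions about how the two CSVs were exported.
import { readFileSync } from 'node:fs';

// Naive CSV parsing (no quoted fields); fine for simple exports.
function csvRows(path: string): string[][] {
  return readFileSync(path, 'utf8').trim().split('\n').map((line) => line.split(','));
}

// Column 0 = URL in both files; column 1 = pageviews in the analytics export.
const poorUrls = new Set(csvRows('gsc-cwv-poor-urls.csv').slice(1).map((r) => r[0]));
const traffic = csvRows('analytics-pageviews-by-url.csv').slice(1);

const prioritised = traffic
  .filter((r) => poorUrls.has(r[0]))
  .sort((a, b) => Number(b[1]) - Number(a[1]));

// Top of the list: pages Google is struggling with that also drive real traffic.
console.table(prioritised.slice(0, 20));
```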
Technical SEO fixes without analytics prioritisation produce motion without direction — you end up fixing the most visible issues rather than the most valuable ones. The commercial data is what transforms a technical audit into a prioritised backlog.
For the analytics layer, Matomo Tag Manager event tracking adds something pure pageview data can't give you: a way to measure whether technical improvements actually change user behaviour, not just whether they move the metric that was targeted. That feedback loop is what turns one-off fixes into a repeatable process.
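Concretely, that means pushing behavioural events into the MTM data layer at the points the fix was supposed to improve. A sketch, with the event name and payload keys invented for illustration, assuming the Matomo Tag Manager container is already loaded and a trigger is configured to listen for the custom event:

```ts
// Sketch: push a behavioural event into the Matomo Tag Manager data layer.
// Assumes the MTM container is loaded and a custom-event trigger exists for
// 'search-result-click'; the event name and payload keys are made up here.
declare global {
  interface Window { _mtm?: Record<string, unknown>[]; }
}

export function trackSearchResultClick(position: number): void {
  window._mtm = window._mtm || [];
  window._mtm.push({
    event: 'search-result-click',  // hypothetical custom event mapped to an MTM trigger
    resultPosition: position,      // compare engagement depth before and after the LCP fix
    page: window.location.pathname,
  });
}
```

If the LCP work on the search results pages did what it was supposed to, you'd expect events like this to move alongside the lab metric, and that comparison is the one worth reporting.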
