Click Attribution Is Dead; AI Killed It
Everyone loves a good “AI overview,” right? Right?!
By: Tyler Davis, Digital Marketing Supervisor
You’ve likely seen them pop up in your web browser or your email inbox. Their job is to condense and simplify information.
To the users, they’re a lifesaver. To those living in-platform managing the ad budget and deciphering data signals, they’re a nightmare.
Why? They soak up searches that would have become clicks, turning back-end attribution into a game of whack-a-mole.
I’m not a fan of over‑complicated “learnings” haphazardly packaged into a derivative blog. So we’ll talk about AI vs. attribution through the lens of the everyday layperson—not a mad scientist digital marketer.
Expect a fresh perspective and a few tears shed (unlikely on the latter).
Lackluster Attribution: Comparing AI Overviews To High School
For this concept, let’s use a metaphor most can relate to. Think back to your high‑school group projects, where the grade never quite reflected the uneven effort.
Jock, nerd, prep, goth – didn’t matter. You probably met Frank. Frank did “vibes.” The A‑student wrote the paper, built the deck, color‑coded the chaos. Frank read the title slide aloud and collected a passable grade anyway.
That’s essentially how attribution works on the internet nowadays.
The A‑student is the open web – publishers, creators, and brands who provide the meat and potatoes that match a user’s search intent. Frank is AI search overviews – a neat summary that paraphrases the work up top. And the other channels (that ad, that branded search, that email) are the students who floated Frank to a better grade than he would have had otherwise.
Problem is… do you think Frank holds himself accountable in this situation? Does he own up to his lackluster effort and sheepishly admit that the other students carried the burden?
No, of course not. Or we’d end the scenario right here.
Who Deserves The Credit?
Quick reality check: In a recent study published by Pew Research Center, Google users who encountered an AI summary “rarely clicked on a link in the summary itself.” A whopping 26% of users who encountered a Google AI Overview ended their browsing session immediately after.
If we only credit the presenter, we miss the author. And when you ask each platform who did the work, they all say “me” – which is very Frank of them.
Meta ads, Google ads, LinkedIn ads, etc., are all tripping over themselves to attribute a click to their platform.
Take a user who’d seen a Facebook or LinkedIn ad in the wild, then clicked on a Google Ad with intent to further research the product or service.
All three platforms are going to fight to claim that click.
But what happens when there’s no click? When there’s no evidence of a user interaction because they’ve read an overview (or a title-slide summary) and called it a day?
Controversy Surrounding AI Overviews
Here’s where our group project spills into the principal’s office.
A major U.S. publisher – Penske Media (Rolling Stone, Billboard, Variety) – just slapped Google with a lawsuit. Penske claims “about 20 percent of Google searches that link to one of Penske’s sites now have AI Overviews.”
The claim here is that Google is steering traffic away from the content’s originators, killing search referrals and, with them, affiliate revenue.
Google’s reply? AI Overviews “help users” and “broaden discovery.”
Translation for non‑marketers: the kid who read the title slide is arguing that his delivery inspired the audience more than the actual content did.
No Happy Ending... Yet
There isn’t a tidy fix for the click vs. AI overview conundrum.
No single setting. No magic button.
So what do you do right now?
- Educate clients: explain “answers without clicks,” show how that distorts last‑click credit, and set expectations.
- Shift to full‑funnel reporting: watch blended metrics (aggregated customer acquisition cost, revenue per visitor, share of branded search), not just each channel’s return on ad spend.
- Measure assists: use assisted‑conversion views, view‑through where appropriate, and simple incrementality tests (holdouts/lift) to see what actually moves outcomes (the sketch below shows the basic math).
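For the spreadsheet-inclined, here’s a minimal sketch of what “blended” and “incremental” actually mean. The numbers, spend figures, and function names are hypothetical, invented purely for illustration; the point is that neither calculation cares which platform claims the click.

```python
# Illustrative only: toy numbers, not real campaign data.

def blended_cac(total_ad_spend: float, total_new_customers: int) -> float:
    """Blended customer acquisition cost across ALL channels,
    ignoring which platform claims credit for the click."""
    return total_ad_spend / total_new_customers

def incremental_lift(conv_rate_exposed: float, conv_rate_holdout: float) -> float:
    """Relative lift: how much better the exposed group converted
    versus a holdout group that never saw the ads."""
    return (conv_rate_exposed - conv_rate_holdout) / conv_rate_holdout

if __name__ == "__main__":
    # Hypothetical month: $50k total spend, 400 new customers across everything.
    print(f"Blended CAC: ${blended_cac(50_000, 400):.2f}")           # $125.00

    # Hypothetical holdout test: 2.4% conversion with ads on, 2.0% with ads off.
    print(f"Incremental lift: {incremental_lift(0.024, 0.020):.0%}")  # 20%
```

The design choice is the whole point: by measuring the pile of spend against the pile of outcomes (and against a group that saw nothing), you sidestep the Frank problem entirely, because nobody gets to self-report who did the work.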
Until the rules change (or the courts do it for them), the win is smarter expectations and blended attribution. Frank still shows up to the presentation; you just stop giving him the A+.
Let’s just say… it wouldn’t be the first time someone’s gotten all the praise for a team’s work behind the scenes.