Counting Ghosts

A case for abandoning web analytics
By P.C. Maffey

In 2019, I wrote Roll Your Own Analytics, my simple approach to privacy-first web analytics. Since then, my feelings have evolved: I've dropped analytics from all my active projects. This is why.
Published 9/14/2023
5 pages

Running web analytics is Standard Operating Procedure for any "real" project online. Just add Google Analytics—and watch the data flow. It's universally effortless and provides a near instant feedback loop. A totally normal, accepted tactic in the playbook for online success.

I want to challenge that notion.

Over the past five years or so, the privacy-first analytics scene exploded. The case is simple: I want to monitor my project's performance to improve it, without feeding the ad-tech machines that are eating the web.

Despite the rise of alternative analytics, despite GDPR making 3rd-party tracking a tad more cumbersome, despite EU regulators ruling Google Analytics unlawful—there's hardly a dent in GA's market share.

For the last 15 years, everyone's been playing the game run by Google and Facebook, to compete for people's attention. A playbook writ large with gamification, manipulation, and extraction. Incentivized by optimizations towards local maxima. Powered by behavioral data collection. And codified into tech canon by books like Hooked (written by this guy).

Analytics feeds that "virtuous cycle." Many paychecks depend on it.

I've no illusions this will change any time soon. From basic product telemetry to surveillance capitalism, no one's asking: what are the costs of running web analytics? What are alternative means of production?

Missed connections

I have a theory about why businesses lose their soul as they grow.

The hard work of understanding people isn't scalable. An organization needs to commodify our ability to make connections, by packaging what's human into something repeatable and quantifiable.

This isn't to say that interactions should still require a handshake and a chat about the grandparents. Transactional efficiency is one of the primary drivers of progress in the world. Especially when blind.

Behavioral analytics exploits that dynamic. It adds hidden friction to both sides of the relationship, no matter how transactional.

On one side: through the lens of analytics, we see our customers, readers, users—real people—as dissected boxes, shaped by the events we track and textured with as much personal information as we are comfortable gleaning from them unawares. Analytics serves as a proxy for understanding people, a crutch we lean into. Until eventually, instead of solving problems, we are just sitting at our computer counting ghosts.

On the other side: data is almost always collected without our knowledge or willing participation, even if, technically, we've given consent. It's no coincidence that the people who best understand tracking capabilities are the ones most likely to run an ad-blocker and opt out of data collection. Surveillance erodes trust, whether people explicitly care about privacy or not. Surveillance erodes trust. Surveillance erodes trust. If I micromanage you, you'll be inherently defensive, less willing to listen to or believe what I have to say, or give attention to what I'm selling...

Which is ironic, because we spent $350 billion on ads in the US last year trying to buy attention. Innumerably more trying to earn it. Trust is the cornerstone of every business not based exclusively on extracting resources. And here we are fingerprinting browsers, tracking your every movement online, all so we can sell you what we want you to want.

Obviously, these soft costs (disconnect, loss of empathy, trust) are hardly quantifiable, and thus, easy to ignore. But when we still don't know which half of our ad spend is wasted, or why some of our content unexpectedly performs so much better than others, the value proposition of analytics begins to look more and more like a trap.


Every tracking script added to a website, every event logged, every network request sent adds a bit of drag to the actual activity happening on the page. A site's performance suffers from wasted cycles collecting and processing analytics data.

The same can be said of our minds. Our work. And our business operations. How much time and energy do we spend obsessing over dashboards and stats? KPIs and fake internet points? Rushing up every little hill in search of optimization gold?

We're tricking ourselves, thinking we're measuring something real, performing some quasi-scientific activity, because we're looking at numbers, making data-driven decisions. Real user events! Web analytics resembles a pseudo social science, its data confounded by the complexities of human behavior. No matter how many A/B tests we run, when our paycheck (or status) depends on the results, our analysis lacks objectivity.

The real danger there: when we choose an erroneous path based on analysis steeped in our confirmation biases.

Web analytics sits in the awkward space between empirical analysis and relationship building, failing at both, distracting from the real job to be done: making connections, in whatever form that means for our project.

That mattered less in an era of free money and endless new eyeballs. When we threw money at anything that moved. But the gold rush of the web is coming to an end. The world has come online.


If you need an analytics script to know you're achieving your goals, you're probably setting the wrong goals.

What's the true measure of your success? Why are you writing? What problem is your work solving? Measure those outcomes.

If it's not possible to quantify those outputs, don't sweat it. Learn to qualify them instead. This may sound naive in our data-driven tech culture, but there's a feeling you get with experience when you know something is working. You learn to assess quality; you cultivate good judgement; and you can communicate why.

Caring about quality is the heart of craftsmanship. Until you're hooked into those outcomes, micro-optimizing the individual parts is pointless. He who sees no wood for trees...

In a living system (which includes anything human-interfacing), there will always be interdependencies that are unknowable. A holistic approach measures the inputs and outputs of the whole system. This becomes the basis for experimentation: how we learn to develop systems that support the growth of relationships.

It's a different perspective. I believe it's a more sustainable approach to growth.

Thanks for reading! I'm a product engineer, writing about how to be a human in a computer world. This site is my digital garden. Explore, enjoy. My mailbox is always open.

Return to my garden