From Clicks to Decisions
"Behavioral analytics sits at the intersection of engineering, product management, marketing, and ethics. The same infrastructure that helps a developer discover a confusing checkout flow can also enable surveillance advertising."
CSE 135 — Full Overview
The shared vocabulary for measuring user activity — from raw hits to actionable segments.
| Level | What It Is | Watch Out |
|---|---|---|
| Hit | Any server request (1 page = dozens of hits) | Useless for behavior; useful for capacity |
| Pageview | One HTML document load | SPAs break this; bots inflate it |
| Event | Discrete user action within a page | GA4 treats everything as events — including pageviews |
| Session | Group of interactions; 30 min timeout | Timeout is convention, not law; can be tuned to shape narrative |
| User | An identifier (cookie, login, fingerprint) | Same person = 2–4 "unique users" across devices |
| Segment | Filtered subset by traits | Aggregate numbers hide the story; segments reveal it |
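The hit → session grouping above can be sketched as a tiny sessionizer. This is an illustrative helper (names are hypothetical), assuming hit timestamps in milliseconds and the conventional 30-minute inactivity timeout:

```javascript
// Group one user's hit timestamps (ms) into sessions using an
// inactivity timeout. 30 minutes is convention, not law — the same
// data yields different session counts at different timeouts.
const THIRTY_MIN = 30 * 60 * 1000;

function sessionize(timestamps, timeout = THIRTY_MIN) {
  const sorted = [...timestamps].sort((a, b) => a - b);
  const sessions = [];
  let current = [];
  for (const t of sorted) {
    // A gap longer than the timeout closes the current session.
    if (current.length && t - current[current.length - 1] > timeout) {
      sessions.push(current);
      current = [];
    }
    current.push(t);
  }
  if (current.length) sessions.push(current);
  return sessions;
}
```

Rerunning the same hits with a shorter timeout yields more, shorter sessions — one concrete way a chosen timeout can shape the narrative.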
Activity data is meaningless without outcomes — the conversions and KPIs that give it purpose. And even with outcomes attached: data !== truth.
Measuring the quality of a visit, not just the quantity.
| Metric | Definition | Measurement |
|---|---|---|
| Scroll Depth | How far down the page a user scrolls (25/50/75/100%) | Intersection Observer API (async, performant) |
| Time on Page | Gap between consecutive pageview timestamps | Undefined for the last page in a session |
| Dwell Time | SERP click to SERP return | Search engine only — not available to site owners |
| Attention Time | Page visible + user active | Page Visibility API + interaction heartbeat |
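Scroll depth reduces to a pure calculation once you know scroll offset, viewport height, and document height. A sketch of the bucketing logic — in production these values would come from an Intersection Observer on sentinel elements rather than manual math:

```javascript
// Given scroll offset, viewport height, and document height (all px),
// return which of the standard 25/50/75/100% thresholds are reached.
function scrollDepthReached(scrollY, viewportH, docH) {
  // The bottom edge of the viewport defines how deep the user has seen.
  const depth = Math.min(1, (scrollY + viewportH) / docH);
  return [25, 50, 75, 100].filter(t => depth * 100 >= t);
}
```

Note that a tall viewport on a short page reports 100% with zero scrolling — "scrolled to bottom" is not the same as "read the page."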
Bounce rate = percentage of single-page sessions with no further interaction.
Universal Analytics: single-pageview session = bounce. Simple count.
GA4: bounce = 1 − engagement rate. Engaged = >10s, or conversion, or 2+ pageviews.
Same user behavior, different bounce rates depending on the tool.
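The divergence is easy to demonstrate: two bounce-rate functions over the same sessions, one per definition. The session shape here is a hypothetical `{pageviews, durationSec, converted}` record:

```javascript
// Universal-Analytics-style: any single-pageview session is a bounce.
function bounceRateUA(sessions) {
  const bounces = sessions.filter(s => s.pageviews === 1).length;
  return bounces / sessions.length;
}

// GA4-style: bounce = 1 - engagement rate, where "engaged" means
// >10 seconds, a conversion, or 2+ pageviews.
function bounceRateGA4(sessions) {
  const engaged = sessions.filter(
    s => s.durationSec > 10 || s.converted || s.pageviews >= 2
  ).length;
  return 1 - engaged / sessions.length;
}
```

A one-pageview session with 45 seconds of reading is a bounce under the first definition and an engaged session under the second — same behavior, different number.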
Conversions, KPIs, vanity metrics — and how organizations game them.
| Vanity Metric | Actionable KPI |
|---|---|
| Total pageviews | Pageviews per session |
| Total registered users | Monthly active users (MAU) |
| Social media followers | Conversion rate from social |
| App downloads | Day-7 retention rate |
| Email list size | Email click-through rate |
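The actionable column is computable from raw events. For example, day-7 retention as a cohort ratio — a sketch over hypothetical `{installDay, activeDays}` user records:

```javascript
// Day-7 retention: of users who installed on a given day, what
// fraction were active again exactly 7 days later? Unlike the raw
// download count, this can't be inflated by users who never return.
function day7Retention(users, cohortDay = 0) {
  const cohort = users.filter(u => u.installDay === cohortDay);
  if (cohort.length === 0) return 0;
  const retained = cohort.filter(u => u.activeDays.includes(cohortDay + 7));
  return retained.length / cohort.length;
}
```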
A dark pattern is UI design that deliberately manipulates users into unintended actions.
| Pattern | How It Works | Metric Inflated |
|---|---|---|
| Confirmshaming | "No thanks, I don't want to save money" | Email signups |
| Roach motel | Easy signup, impossible cancellation | Registered users, retention |
| Sneak into basket | Pre-added insurance/warranties in cart | Average order value |
| Disguised ads | Ads styled as content or download buttons | Click-through rate |
| Infinite scroll / pagination | Splitting content unnecessarily to force extra loads | Pageviews, time on site |
Who gets credit for a conversion? Everyone claims it; nobody deserves all of it.
User journey: Social → Search → Email → Paid Ad → Conversion
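Different attribution models split credit for that journey differently. A sketch of three common rules, using the channel names from the journey above:

```javascript
// Distribute one conversion's credit across an ordered list of
// touchpoints according to an attribution model.
function attribute(touchpoints, model) {
  const credit = Object.fromEntries(touchpoints.map(t => [t, 0]));
  const n = touchpoints.length;
  if (model === 'last-click') {
    credit[touchpoints[n - 1]] += 1;       // all credit to the final touch
  } else if (model === 'first-click') {
    credit[touchpoints[0]] += 1;           // all credit to the first touch
  } else if (model === 'linear') {
    for (const t of touchpoints) credit[t] += 1 / n; // equal split
  }
  return credit;
}

const journey = ['Social', 'Search', 'Email', 'Paid Ad'];
```

Each model hands 100% of the credit out, yet no model can tell you which touchpoint actually caused the conversion — that is the point of the section.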
The ~85% monopoly and the script injection layer that feeds it.
Google is simultaneously the largest ad seller, the dominant analytics platform, and the dominant search engine. Each role creates conflicts with the others.
The scientific method applied to web design.
Client-side (JS) testing: JS modifies the DOM after load. Flicker problem: the original flashes before the variant appears.
Server-side testing: the server renders the variant before sending the HTML. No flicker, full control, but requires dev integration.
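Server-side assignment is typically a deterministic hash of the user ID, so the same user sees the same variant on every request and every server. A minimal sketch — the hash function here is illustrative, not a recommendation:

```javascript
// Deterministically assign a user to one of `variants` by hashing
// their ID. Same ID in, same variant out — no stored assignment table.
function assignVariant(userId, variants) {
  let hash = 0;
  for (const ch of userId) {
    // Simple multiplicative string hash, kept in uint32 range.
    hash = (hash * 31 + ch.charCodeAt(0)) >>> 0;
  }
  return variants[hash % variants.length];
}
```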
Multivariate testing (MVT): Test multiple elements simultaneously (3 headlines × 2 images × 2 buttons = 12 combinations). Requires enormous traffic. Impractical for most sites.
Watching what users do (DOM reconstruction, not video) and measuring UX at scale.
Fidelity challenges: CSS-in-JS (runtime classes), Shadow DOM (invisible to MutationObserver), Canvas/WebGL (no DOM to capture), cross-origin iframes (sandboxed).
Google's structured approach to measuring UX at scale:
| Dimension | What It Measures | Example Metric |
|---|---|---|
| Happiness | Satisfaction, attitudes | CSAT score after task |
| Engagement | Depth & frequency of use | 7-day active users / total |
| Adoption | New users picking up features | % tried feature X in 7 days |
| Retention | Users coming back | Day-30 retention rate |
| Task Success | Can users accomplish goals? | Checkout completion rate |
The qualitative "why" that analytics cannot provide — and the ways we misread the data we do have.
Strongest insights come from combining quadrants. Analytics alone knows what but not why.
A metric can go up in every segment but down overall:
| Segment | Week 1 | Week 2 | Trend |
|---|---|---|---|
| Mobile (small → large volume) | 2% of 1,000 | 3% of 8,000 | Up |
| Desktop (large → small volume) | 8% of 9,000 | 9% of 2,000 | Up |
| Overall | 7.4% | 4.2% | Down |
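Recomputing the blended rate makes the paradox concrete: each segment's rate rises, yet the overall rate falls because traffic shifted toward the lower-converting segment:

```javascript
// Weekly conversion data per segment: [conversionRate, visitors].
const week1 = { mobile: [0.02, 1000], desktop: [0.08, 9000] };
const week2 = { mobile: [0.03, 8000], desktop: [0.09, 2000] };

// Blended rate = total conversions / total visitors, which weights
// each segment by its traffic share — the source of the paradox.
function overallRate(week) {
  let conversions = 0, visitors = 0;
  for (const [rate, n] of Object.values(week)) {
    conversions += rate * n;
    visitors += n;
  }
  return conversions / visitors;
}
```

Week 1 blends to 7.4% (20 + 720 conversions over 10,000 visitors); week 2 blends to 4.2% (240 + 180 over 10,000) even though both segments improved.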
The blind spots baked into every behavioral analytics system.
GOV.UK study: ~1.1% of users did not receive JS-enhanced pages. The causes:
| Cause | Nature |
|---|---|
| Network interruption (JS failed to download) | Delivery failure (most common) |
| Corporate proxy/firewall stripping scripts | Delivery failure |
| Prior script error breaking subsequent scripts | Cascading failure |
| Browser extension blocking (ad blocker) | Deliberate (~15–30% desktop) |
| User deliberately disabled JS | Deliberate, but tiny (~0.2%) |
From consent banners to data warehouses to surveillance profiles.
Consent banners are often dark patterns in their own right. And each analytics goal demands more data, raising the privacy tension:
| Goal | Data Needed | Privacy Tension |
|---|---|---|
| Page popularity | URL + count | Low |
| User journeys | Session-level sequence | Medium |
| Cross-session behavior | Persistent user ID | High |
| Session replay | Full DOM + interactions | Very high |
Practical choices today — and why the entire model may be changing.
| Factor | Simple / Privacy | Product Analytics | Enterprise |
|---|---|---|---|
| Tools | Plausible, Fathom | PostHog, Amplitude | Adobe, GA4 + BigQuery |
| Budget | $0–$20/mo | $0–$2K/mo | $10K+/mo |
| Setup | Single script tag | Event taxonomy + SDK | Warehouse + ETL + team |
| GDPR | Cookie-free, no consent | Consent required | DPA + legal + CMP |
| Key strength | Pageviews, referrers | Funnels, cohorts, retention | Attribution, segmentation |
sendBeacon() returns false on oversized payloads — ignore the return value and the data silently disappears. Building a sessionizer exposes the arbitrary 30-minute timeout. Building a dashboard shows how easy it is to present misleading aggregations.
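The beacon size limit can be guarded explicitly. A sketch of a byte-length check before handing data to `navigator.sendBeacon` — the 64 KiB figure is the queue cap suggested by the Beacon spec, and real browsers may enforce different limits:

```javascript
// navigator.sendBeacon() returns false when the payload would exceed
// the user agent's beacon queue limit. Checking size up front lets a
// caller fall back to fetch() instead of silently losing the data.
const BEACON_LIMIT = 64 * 1024;

function payloadFits(payload, limit = BEACON_LIMIT) {
  // Byte length, not string length: multi-byte characters count fully.
  return new TextEncoder().encode(payload).length <= limit;
}

// Hypothetical send helper (browser-only APIs, shown for shape only):
// function sendEvent(url, payload) {
//   if (payloadFits(payload) && navigator.sendBeacon(url, payload)) return;
//   fetch(url, { method: 'POST', body: payload, keepalive: true });
// }
```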
Key takeaways from behavioral analytics.
| Topic | Takeaway |
|---|---|
| Visitation Model | Hits → pageviews → events → sessions → users → segments. "Unique user" is a fiction. |
| Engagement | Scroll depth, time, clicks measure quality — but high engagement can mean frustration |
| Outcomes | KPIs connect behavior to value; vanity metrics and dark patterns corrupt the feedback loop |
| Attribution | All models are simplifications. Walled gardens double-count. Incrementality testing is the only causal answer. |
| GA & Tag Managers | ~85% monopoly; "free" = you train ad models. TMS adds performance/security risk. |
| A/B Testing | Scientific method for the web — but requires statistical rigor most sites lack |
| Replay & Usability | DOM reconstruction, not video. HEART framework. Analytics detects; qualitative research diagnoses. |
| VoC & Pitfalls | Combine quant + qual. Single-metric thinking, Simpson's paradox, averages — the default is to misread data. |
| Data Quality | JS delivery failures, consent bias, identity resolution — every source has systematic blind spots |
| Privacy & Abuse | Gentle slope from analytics to surveillance. Each step individually justifiable; cumulative result is insidious. |
| LLMs & Zero-Click | When answers happen off-site, visit-based analytics has a structural blind spot for your most valuable audience |